The ‘illusion of explanatory depth’ is a cognitive bias that leads people to believe they understand complex issues, mechanisms and devices far better than they actually do, right up until they are required to explain what they know.
In a study conducted in 2001, the psychologists Leonid Rozenblit and Frank Keil asked graduate students from Yale to rate their understanding of how 48 different devices work; the device list included everything from a can opener to the gearbox of a car to a ballpoint pen. After the students had rated themselves, they were required to complete a series of tasks on a small number of the devices; at the end of each task, they were asked to re-rate their level of understanding. One of the chosen items was a cylinder lock.
The three main tasks are below. Remember, after completing each one, the participants had to re-rate their level of understanding.
After completing each of the first two tasks, the students’ ratings of their understanding declined. Rozenblit and Keil highlight in the paper that ‘there were large and significant drops in participants’ estimates of their knowledge as a result of trying to explain the devices.’ However, perhaps surprisingly, their ratings then increased after completing the third task. This final rise is important because it suggests that the students weren’t simply losing confidence as the experiment progressed. Rather, it seems that they gained knowledge and insight as a result of engaging with the expert explanation.
The challenge of explaining things helps us to assess what we know and don’t know with greater accuracy. And whilst this can sometimes be an uncomfortable and humbling experience, it’s also a vital one because it fosters a mindset of productive humility. For example, I like to think that I have a fairly good understanding of dual-coding theory. But, rationally, I know that I don’t. If you asked me to explain anything beyond the headline details – without letting me do some reading beforehand – I’d struggle. In fact, you’d probably discover that I know about as much about dual-coding as I do about cylinder locks.
Blithely assuming that we know our stuff or simply trusting that others do will never be good enough because doing so comes at a cost. For example, think of the lethal mutations that arose from misguided interpretations of the Assessment for Learning research all those years ago. And now think of some current mutations; there are quite a few to choose from (the misapplication of dual-coding theory being just one of them). Good intentions and superficial knowledge can be a toxic combination, and even more so when groupthink and confirmation bias are involved.
So, stay humble and stay curious. And challenge those around you to do the same by asking them to explain exactly what they mean when they say something is important or interesting or great or awful, and so on.
Thanks for reading –
Doug