The Illusion of Explanatory Depth

I am utterly fascinated by the concept of metacognition — how we evaluate and measure our own skills and abilities. This is the main idea underlying the Dunning-Kruger curve, the tendency for those with lower levels of competency to wildly overestimate their own skill levels.

This impacts everything and everyone, from policymakers to entrepreneurs to, most especially, investors.

There is a related concept I referenced earlier this month called “The Illusion of Explanatory Depth.” It is the intriguing idea that we think we actually know much more about things — and can easily explain them — when in fact we do not.

We know how to get the information; we know who to call to fix a toilet when it breaks — but most of us do not actually know how these things work or (and this is the key) have the ability to explain them.

Stephen Dubner discussed the Illusion of Explanatory Depth in a recent episode of Freakonomics: “Think of something you have a really strong opinion about. . . Now think about why you have such a strong opinion. How well do you think you could explain your position?”

If you want an even simpler example, could you explain how a toilet or zipper or ballpoint pen or bicycle works? Can you describe how pencils are manufactured? You probably think you understand these things, but once you try to explain the specifics, you might find you falter and understand less than you previously imagined.

Here is Professor Steven Sloman, a cognitive psychologist at Brown University:

“If you’re forced to give an explanation, you have to really understand, and you have to confront the fact that you might not understand. Whereas when you give reasons, then you do what people do around the Thanksgiving dinner table. They talk about their feelings about it, what they like, what they don’t like.”

This is why Nobel-winning physicist Richard Feynman suggested, “If you want to master something, teach it.” Dr. Mortimer J. Adler, founder of the Institute for Philosophical Research, was even more blunt: “The person who says he knows what he thinks but cannot express it usually does not know what he thinks.”

The illusion allows people to believe they know more than they do. I suspect the reason for this is what we might call being “Knowledge Adjacent.” Feynman also observed, “I learned very early the difference between knowing the name of something and knowing something.”

Consider how often we do not know the answer to specific, detailed questions about how things like zippers or ballpoint pens work, yet we do know how to find those answers. Knowledge Adjacency is a misleading form of intelligence and understanding: we confuse what our community collectively knows with our own individual understanding.

This leads to a misperception, a false belief that we know more than we actually do.

Now apply this to social media and the internet.

We are surrounded by other people, and dependent upon their abilities and expertise. This is true whether it is an Uber driver or a medical doctor or a home builder. We enjoy massive technological advancements, not because we can build an iPhone or index the internet ourselves, but because we know how to Google and where to buy a phone.

Knowledge is distributed widely in our society. We err when we mistake that collective knowledge for our own individual understanding.

Sources:
Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments.
Justin Kruger and David Dunning, Cornell University
Journal of Personality and Social Psychology, 77(6), 1121-1134 (December 1999)
https://www.ncbi.nlm.nih.gov/pubmed/10626367

How to Change Your Mind (Ep. 379)
Stephen J. Dubner
Freakonomics, May 29, 2019
freakonomics.com/podcast/change-your-mind/

Previously:
How to Change Your Mind (August 7, 2019)

More of the Dubner/Sloman discussion after the jump…

As you can see, there are a lot of reasons why a given person might be reluctant to change their mind about a given thing. Ego, selective memory, overconfidence, the cost of losing family or friends. But let’s say you remain committed to changing minds — your own or someone else’s. How do you get that done? The secret may lie not in a grand theoretical framework, but in small, mundane objects:

SLOMAN: Toilets and zippers and ballpoint pens.

*     *     *

Think of something you have a really strong opinion about. Maybe the best ways to address climate change. The perils of income inequality. How to balance privacy and security. Now think about why you have such a strong opinion. How well do you think you could explain your position?

Steven SLOMAN: If you’re forced to give an explanation, you have to really understand, and you have to confront the fact that you might not understand. Whereas when you give reasons, then you do what people do around the Thanksgiving dinner table. They talk about their feelings about it, what they like, what they don’t like.

That’s Steven Sloman.

SLOMAN: I’m a professor of cognitive, linguistic, and psychological sciences at Brown University.

DUBNER: And that means, in a nutshell, that you try to understand what?

SLOMAN: I try to understand how people think.

DUBNER: Easy question first: How do you get someone to change their mind?

SLOMAN: Well, first of all, there’s no silver bullet. It’s really hard. But if you’re going to try, the first thing you should do is try to get them to change their own minds. And you do that by simply asking them to assume your perspective and explain why you might be right. If you can get people to step outside themselves and think about the issue — not even necessarily from your perspective, but from an objective perspective, from one that is detached from their own interests — people learn a lot. So, given how hard it is for people to assume other people’s perspectives, you can see why I started my answer by saying it’s very hard.

One experiment Sloman has done is asking people to explain — not reason, as he pointed out, but to actually explain, at the nuts-and-bolts level — how something works.

SLOMAN: People don’t really like to engage in the kind of mechanistic analysis required for a causal explanation.

That’s true not only for big, thorny issues like climate change or income inequality, but even for things like:

SLOMAN: Toilets and zippers and ballpoint pens.

Unless you are a plumber or you make zippers or ballpoint pens, you probably can’t explain these very well. Even though, before you were asked the question, you would have thought you could. This gap between what you know and what you think you know is called, naturally, the “illusion of explanatory depth.”

SLOMAN: So, the illusion of explanatory depth was first demonstrated by a couple of psychologists named Rozenblit and Keil. And they asked people how well they understood how these things worked, and people gave a number between 1 and 7. And then they said, “Okay, how does it work? Explain in as much detail as you can how it works.” And people struggled and struggled and realized they couldn’t. And so when they were again asked how well they understood, their judgments tended to be lower. In other words, people themselves admitted that they had been living in this illusion, that they understood how these things worked, when, in fact, they don’t.

Where does this illusion come from?

SLOMAN: We think the source of the illusion is that people fail to distinguish what they know from what others know. We’re constantly depending on other people, and the actual processing that goes on is distributed among people in our community.

In other words, someone knows how a toilet works: the plumber. And you know the plumber; or, even if you don’t know the plumber, you know how to find a plumber.

SLOMAN: It’s as if the sense of understanding is contagious. When other people understand, you feel like you understand.

You can see how the illusion of explanatory depth could be helpful in some scenarios — you don’t need to know everything for yourself, as long as you know someone who knows someone who knows something. But you could also imagine scenarios in which the illusion could be problematic.

SLOMAN: So we’ve shown that that’s also true in the political domain.

Sloman and his collaborator Philip Fernbach basically repeated the Rozenblit and Keil experiment, but instead of toilets and zippers, they asked people about climate change and gun control.

SLOMAN: We gave people political policies. We said, “How well do you understand them?” and “Please explain them.”

Unsurprisingly, most people were not able to explain climate-change policies in much detail. But here’s what’s interesting. The level of confidence in their understanding of issues, which participants were asked to report at the start of the experiment, was drastically reduced after they tried, and failed, to demonstrate their understanding.

SLOMAN: In other words, asking people to explain depolarized the group.

Now, was this a case of simply slowing down and thinking the issue through? Could it be that we’re often inflexible in our thinking simply because we come to conclusions too quickly? Apparently not.

SLOMAN: If instead of saying, “Explain how the policy works,” if what we said to them was, “Give us all the reasons you have for your view on this policy,” then we didn’t get that effect at all. That didn’t reduce people’s sense of understanding; it didn’t reduce their hubris.

DUBNER: The ability to change your mind — would you say that’s really important as a human?

SLOMAN: I see the mind as something that’s shared with other people. I think the mind is actually something that exists within a community and not within a skull. And so, when you’re changing your mind you’re doing one of two things: you’re either dissociating yourself from your community — and that’s really hard and not necessarily good for you — or you have to change the mind of the entire community. And is that important? Well, the closer we are to truth, the more likely we are to succeed as individuals, as a species. But it’s hard.

DUBNER: Do you think that most of us hold the beliefs that we do because the people around us hold those beliefs, or do you think we’re more likely to assemble people around us based on the beliefs that they and we hold?

SLOMAN: The former is more often true. That is, we believe what we do because the people around us believe what they do. This is the way humanity evolved. We depend on other people. And it’s not simply a matter of getting us to think more independently. I actually think that this is one of the major problems with the kinds of solutions people are talking about today for our current political problems. I don’t think the solution is give people the information they need.

Matthew JACKSON: More information can be good if it’s very well-filtered and curated, but that’s not easy to do in an unbiased way.