Why we pretend to know things, explained by a cognitive scientist

New research explains why we pretend to know more than we do.


Why do people pretend to know things? Why does confidence so often scale with ignorance? Steven Sloman, a professor of cognitive science at Brown University, has some compelling answers to these questions.

“We're biased to preserve our sense of rightness,” he told me, “and we have to be.”

Sloman, the author of The Knowledge Illusion: Why We Never Think Alone, studies judgment, decision-making, and reasoning. He’s especially interested in what’s called “the illusion of explanatory depth.” This is how cognitive scientists refer to our tendency to overestimate our understanding of how the world works.

We do this, Sloman says, because of our reliance on other minds.

“The decisions we make, the attitudes we form, the judgments we make, depend very much on what other people are thinking,” he said.

If the people around us are wrong about something, there’s a good chance we will be too. Proximity to truth compounds in the same way.

In this interview, Sloman and I talk about the problem of unjustified belief. I ask him about the political implications of his research, and whether he thinks the rise of “fake news” and “alternative facts” has amplified our cognitive biases.

This conversation has been lightly edited for length and clarity.


Sean Illing

How do people form opinions?

Steven Sloman

I really do believe that our attitudes are shaped much more by our social groups than they are by facts on the ground. We are not great reasoners. Most people don't like to think at all, or like to think as little as possible. And by most, I mean roughly 70 percent of the population. Even the rest seem to devote a lot of their resources to justifying beliefs that they want to hold, as opposed to forming credible beliefs based only on fact.

Think about what happens if you utter a fact that contradicts the opinions of the majority of your social group. You pay a price for that. If I said I voted for Trump, most of my academic colleagues would think I'm crazy. They wouldn't want to talk to me. That's how social pressure influences our epistemological commitments, and it often does so in imperceptible ways.

Sean Illing

This is another way of saying that we live in a community of knowledge.

Steven Sloman

That's right. I believe every thought we have depends on thoughts that other people are having. When I cross the street, my actions depend on the thoughts going through the driver's head. If I get on the bus, the success of my endeavor depends on the thoughts going on in the bus driver's head.

When I express an attitude about immigration, what am I really doing? What do I really know about immigration? I live in a very limited universe, and so I have to depend on the beliefs and knowledge of other people. I know what I’ve read; I know what I’ve heard from experts. I don’t have any direct experience of the immigration problem; I haven’t visited the border and studied it myself.

In that sense, the decisions we make, the attitudes we form, the judgments we make, depend very much on what other people are thinking.

Sean Illing

There are some obvious dangers here, right?

Steven Sloman

One danger is that if I think I understand because the people around me think they understand, and the people around me all think they understand because the people around them all think they understand, then it turns out we can all have this strong sense of understanding even though no one really has any idea what they're talking about.

Sean Illing

I’m trying to think about all of this in terms of our political circumstances. Most of us don’t understand as much as we think, and yet we’re all cocksure about a range of issues. So when we are arguing about politics, what are we really arguing about? Is it about getting it right or is it about preserving our sense of rightness?

Steven Sloman

I'm not sure there's a sharp distinction between wanting to get it right and wanting to preserve our sense of rightness. In the political domain, like most domains in which we don't just hear or see what's true, we rely on social consensus. So argument is about trying to convince others while we're trying to convince ourselves. Getting it right essentially means we're convinced.

Of course, we're biased to preserve our sense of rightness, but we have to be. If we weren't, we'd be starting again each time we approached an issue; our previous arguments would be for naught.

Nevertheless, people differ on this. Everyone has a compulsion to be right, meaning that they want the people around them to think they're right, and this is easily achieved by mouthing the things that the people around you say. And people who are more capable tend to be better at finding ways to interpret new facts in line with their community's preconceptions.

But some people do try to rise above the crowd: to verify claims independently, to give fair hearing to others' claims, and to follow the data where it actually leads. In fact, many people are trained to do that: scientists, judges, forensic investigators, physicians, etc. That doesn't mean they always do (and they don't always), just that they're supposed to try.

I like to live in communities that put a premium on getting things right even when they fly in the face of social norms. This means living with constant tension, but it's worth it.

Sean Illing

This phenomenon, the “illusion of explanatory depth,” applies equally to people on the left and the right. This isn’t a partisan problem; it’s a human problem.

Steven Sloman

That's exactly right, and our data shows this clearly.

Sean Illing

How do you collect that data? What sorts of experiments have you done to tease out these tendencies?

Steven Sloman

I run experiments in my lab and over the internet. We try to find representative groups of Americans and ask them questions, mostly hypothetical questions. In the case of the illusion with political policies, we ask people to rate their attitude and their understanding of a policy, then ask them to explain the policy (what it is and how it would lead to specific consequences), and then they rate their own understanding and their attitude again. We find that the attempt to explain reduces their sense of understanding and also makes their attitude less extreme, on average.
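To make the pre/post comparison concrete, here is a minimal, hypothetical Python sketch of the kind of analysis described above. The ratings below are invented purely for illustration; they are not Sloman's data, and the field names are assumptions.

```python
# Hypothetical sketch of an illusion-of-explanatory-depth analysis.
# Each participant rates their understanding of a policy (1-7) and the
# extremity of their attitude before and after being asked to explain
# how the policy actually works. All numbers are made up for illustration.
participants = [
    {"understanding_pre": 6, "understanding_post": 4, "extremity_pre": 3, "extremity_post": 2},
    {"understanding_pre": 5, "understanding_post": 5, "extremity_pre": 2, "extremity_post": 2},
    {"understanding_pre": 7, "understanding_post": 3, "extremity_pre": 3, "extremity_post": 1},
]

def mean(values):
    return sum(values) / len(values)

# Compare average ratings before and after the explanation task.
for measure in ("understanding", "extremity"):
    pre = mean([p[f"{measure}_pre"] for p in participants])
    post = mean([p[f"{measure}_post"] for p in participants])
    print(f"{measure}: {pre:.2f} before explaining -> {post:.2f} after")
```

The finding Sloman describes is that, on average, both numbers drop after participants try to explain the policy.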

Sean Illing

Is it necessarily a bad thing that we rely on other people for much of our knowledge? Is this just a response to our bandwidth problem? There’s only so much we can know and learn for ourselves, so it seems we have little choice but to function as we do.

Steven Sloman

I think it's necessary. We have no choice. There's no way one head can master everything, so we have to depend on other people. This is a perfectly rational response to our condition. However, we don't have to live in an illusion. If we don't understand, we don't have to think that we understand. But I get that some people have to believe that they understand in order to get through life.

The problem is that too often our beliefs support ideas or policies that are totally unjustified.

Sean Illing

This is where we careen into troubling territory for me. Ignorance and confidence make a lethal cocktail. When someone's confidence scales with their ignorance, there's really no way to engage them.

Steven Sloman

That's absolutely right. This is a very dangerous kind of hubris. And our president is really example number one of this. But we have to think about the community that made this administration possible. As much as I hate hearing Trump lie, I get even crazier when I see that 44 percent of the American population put more faith in his words than in the mainstream media.

That's what makes me crazy, because that's what gives him his power.

[Author’s note: a recent Quinnipiac poll indicates that 37 percent of Americans trust Trump more than the media.]

Sean Illing

Well, this is why the prevalence of "fake news" and "alternative facts" is so pernicious.

Steven Sloman

Absolutely. I worry about this on the right and, as you suggested, I worry about this on the left as well.

Sean Illing

Is there any evidence to suggest that we’re getting better at reasoning? Are we gradually overcoming our cognitive biases?

Steven Sloman

My reaction to this is so different than it would've been eight months ago.

Sean Illing

I’m guessing you think the internet and our balkanized media landscape have made things worse.

Steven Sloman

It's very clear that we're even more in a bubble than we ever were. I've been shocked by how little I know about half the American population. I just can't get my head around the way they think at all. And that hasn't changed. Even though I make an effort, it's still the case that everyone around me sees things the way I do, and I'm sure nearly everyone in Grand Rapids, Michigan, sees things differently. But I don't talk to those people.

The internet is clearly making it worse in the sense that we can reach out and form these online communities of fellow believers. And the fact that our news is getting individualized makes it much worse. So, even if I want to understand what the other side sees, Google is constantly feeding me the things I want to see.

And that's bad for all of us.

Sean Illing

So do you have any thoughts in terms of practical solutions to this? How can we cultivate more self-awareness and less biased reasoning? How can we seek out wiser communities of knowledge?

Steven Sloman

People who are more reflective are less susceptible to the illusion. There are some simple questions you can use to measure reflectivity. They tend to have this form: How many animals of each kind did Moses load onto the ark? Most people say two, but more reflective people say zero. (It was Noah, not Moses, who built the ark.)

The trick is to not only come to a conclusion, but to verify that conclusion. There are many communities that encourage verification (e.g., scientific, forensic, medical, judicial communities). You just need one person to say, "Are you sure?" and for everyone else to care about the justification. There's no reason that every community could not adopt these kinds of norms. The problem of course is that there's a strong compulsion to make people feel good by telling them what they want to hear, and for everyone to agree. That's largely what gives us a sense of identity. There's a strong tension here.

My colleagues and I are studying whether one way to open up discourse is to try to change the nature of conversation from a focus on what people value to one about actual consequences. When you talk about actual consequences, you're forced into the weeds of what's actually happening, which is a diversion from our normal focus on our feelings and what's going on in our heads.
