Leeds Business Insights Season 1 Episode #7
S1E7: Professor Phil Fernbach - Shattering the Knowledge Illusion
[00:00:00] Amanda: Welcome to the Leeds Business Insights podcast, featuring expert analysis to help you stand out from the herd. My name is Amanda Kramer. We are so thrilled to have Phil Fernbach here today. Phil Fernbach is a professor of marketing in the Leeds School of Business at the University of Colorado Boulder. He is a cognitive scientist who studies how people think, and he applies insights from his research to improve public discourse and help consumers and managers make better decisions. He is co-author, with Steven Sloman, of the 2017 book, The Knowledge Illusion: Why We Never Think Alone, which was chosen as an editor's pick by the New York Times. The book explores why we think we know so much more than we do and the profound implications for individuals and society.
Today, we are going to be talking about The Knowledge Illusion. We are living in an age of fake news and science denialism that has polarized discussions around vaccinations, climate change, genetically modified food, and on and on. Is the answer that you're right and everyone who doesn't agree with you is stupid? Or is it possible that you're not a molecular biologist, and you are oversimplifying highly complex studies in genetic engineering?
Today's guest on Leeds Business Insights is Phil Fernbach, an expert on overconfidence and what he calls The Knowledge Illusion. He is here to shed some light on how people overestimate their understanding of the world and the consequences that has for us.
Welcome, Phil, and thank you so much for being here.
[00:01:33] Phil: Thanks for having me. It's a pleasure.
[00:01:35] Amanda: Great. We're really interested in learning more. So, Phil, your research largely centers around The Knowledge Illusion. Tell us more about this illusion.
[00:01:45] Phil: The Knowledge Illusion is a very profound fact about human beings. This is actually a phenomenon that was first studied in the cognitive science world, in the nineties, by a researcher by the name of Frank Keil. In his studies, he brought people into the lab and asked them about their understanding of common household objects, like zippers and toilets and ballpoint pens. The first thing he would do is ask people how well they understood those objects. Ask yourself: how well do you understand how a toilet works? His subjects, and most people when we've studied the same things, sort of nod their heads and say, "Oh yeah, I know how that works. I have a decent understanding of the mechanisms." In the next phase of the study, he would ask people to explain in detail exactly how the object works, and what he found was pretty amazing. He found that people, in general, know remarkably little about the way the world works. They reach inside and try to explain these phenomena, and they realize that they have almost nothing to say, maybe one or two sentences. And yet, at the beginning, they had that feeling that they understood these things in a lot more depth than they actually do. The disconnect between those two things is what's called in the cognitive science world "the illusion of explanatory depth": people's belief that they can explain things in more depth than they actually can. I got really interested in that phenomenon because my co-author Steve Sloman and I realized that this wasn't really about toilets and ballpoint pens; it was actually about a lot of things. It's a really pervasive phenomenon that we tend to overestimate our understanding of things. And we started studying this in other areas where the implications are really important, like, for instance, people's political beliefs, or their beliefs about some of the issues you mentioned before, like science-based issues and so on.
[00:03:30] Amanda: Thank you very much for that. Now that you've explained to us what The Knowledge Illusion is, where did you go from there? Where did you take this information?
[00:03:39] Phil: Starting with this basic idea that people overestimate their understanding of the world, we asked two big questions. The first one is why the illusion happens, and that gets into the psychology of the way that the mind works. I'm a cognitive scientist by training, as is Sloman, my co-author. So if you look at our book, the first half of the book is really a book of cognitive science, which is trying to dissect and understand why this illusion happens. And in order to do so, you have to go pretty deep into human psychology and the psychology of groups as well, social cognition. The other question that we ask is, "Where does this play out?" What sort of benefits and drawbacks does this have for human beings and society? And there, we talk about a lot of different areas. I mentioned politics. I mentioned science, but also areas like decision-making, education, and conceptions of what it means to be intelligent and how to work together in groups, and all this other kind of stuff. So, the second half of the book is applying the concepts in the first half of the book to these important issues. And the reason we wrote a whole book about this is that as we started peeling back the layers, we just found that there was a lot there.
[00:04:43] Amanda: Now that we understand more about the illusion, can you tell us more about the root of this illusion, Phil, to ground us as we move forward and eventually think about implications?
[00:04:54] Phil: Yeah. So, one of the big ideas that we talk about is this notion of ignorance. The illusion has two pieces. One is that we don't know very much, which is, you know, kind of interesting in and of itself. And the second is that we think we know more than we do. So, the first piece: why are individuals ignorant about the way that the world works? The answer is that human beings, as individuals, just don't have that much capacity to store complex, detailed information about the world. The mind is not really built for that. There's just too much complexity in the world, and as individuals, if we tried to master all of that complexity, we would just fail. It's impossible. There was a great study that we talk about in the book by a guy named Landauer, a psychologist in the '80s, who was trying to estimate the memory capacity of human beings on the same scale as you would measure the memory capacity of a computer. And I won't get into the details of how he did this, but it was pretty clever. He got to an estimate in a few different ways, and all the estimates kind of agreed, and the answer was mind-blowing. It was one gigabyte, which is way less than you would have on a thumb drive that might be in your pocket right now. Human beings are just not built to store a lot of detailed information; we're built for other things. We're really good at common sense, at generalization, at figuring out how to solve problems in new situations. We tend not to store a lot of detailed information, because a lot of the time we don't need that detailed information to get by in the world. I might not know how a toilet works, but I know who to call if it breaks, so I might not need to know how it works in a lot of detail. So, that raises the second question, which is why do we overestimate our understanding? Why do we fail to realize that we don't know as much as we think we do? The answer that we come to in the book is pretty deep, and it gets into the subtitle of the book, which is "Why We Never Think Alone." The concept is that human beings were not built as individuals to store a lot of detailed information. What we're built to do is to collaborate in communities of knowledge, in groups where every individual has a specialized little bit of knowledge, and we can work together, as a community, to pursue arbitrarily complex goals. So you don't need any individual in the group to know everything. What you need is for everybody to know their little piece. And then, as human beings, we need to have cognitive functions, cognitive capacities, to be able to collaborate and work together with other human beings. That seems to be what human beings are so good at. So where does the illusion come from? The illusion comes from the fact that most of what we know, or think we know, is actually not in our own heads, but in the heads of other people, or in the environment, or on the internet. Because it's so natural for us to rely on information that exists outside of our own heads, we often fail to realize what's in our heads and what's elsewhere. So if everybody around me is sort of nodding their head and saying, "Oh yeah, we understand this," we get the feeling that we ourselves understand it as well. And that sense of understanding, which comes from the people around us having knowledge that we don't have, is where we argue the illusion comes from.
And you can see why we had to write a whole book about this, because the story is not so simple. It's actually a pretty nuanced account of the way the mind works and the way knowledge works across human beings, and so on. But that's the basic idea: as individuals, we don't know very much, because it's impossible for us to know that much about the world, and we rely on others in our communities for a lot of the knowledge that we draw on. And that gives us the feeling that we ourselves understand when sometimes we don't.
[00:08:32] Amanda: That's really interesting, Phil. Especially as we think about the implications for group work, which is incredibly important within the business world. Can you tell us more about the implications for groups within a business context?
[00:08:47] Phil: Yeah, absolutely. Some of the deepest implications of these ideas are in terms of the way we should think about how we add value to the world or to a business. There's a traditional conception of intelligence that we discuss at length in the book. It's this idea called G, which stands for general intelligence. And that's traditionally been the way people think about someone's value: "Oh, they're smart." Everyone thinks they know what that means. Usually, what that entails is being really good at thinking quickly, solving tricky anagrams, retaining a bunch of information, all that kind of stuff you would traditionally see on an intelligence test. But what these ideas suggest to me is that maybe it's not so important that someone has a huge amount of mental horsepower, for instance, or that they know, or feel like they know, tons of information, because everything human beings do that matters is, really, by virtue of this ability to collaborate in groups where people have different bits of information. And it's really the process of collaboration that creates something great. You should really evaluate people's worth, not by how smart they are in the traditional sense, but in terms of how much they help a group to succeed. And there are a lot of different ways a person can help a group succeed. Maybe you do want someone on the team who has tons of mental horsepower and can think quickly, and so on, but you need other types of people who contribute in different ways as well. A good team has complementary sets of skills. And the counterpoint to this is that a good leader should be aware that they themselves don't understand everything about a particular problem. All the problems we work on in business nowadays are complex, and I'm sure your listeners have the experience of running into people all the time who feel like they know everything. In fact, they might feel pressure to pretend they understand everything in detail. That's why people never ask obvious questions in a business setting: because they think that they should understand everything in detail and know everything. So a good group is going to have a leader who understands his or her limitations and knows how to put together a team with complementary skills in a way that functions effectively.
[00:11:01] Amanda: Absolutely. That certainly makes sense. And, Phil, I'm wondering how this might connect to what you talk about in terms of our sometimes overestimating our abilities, and memory kind of reinforcing that we're better at something than we are? Could you talk to us more about this memory bias as it relates to overconfidence and investing, in some work that you've recently completed?
[00:11:27] Phil: Sure. Yeah, I'd be happy to. That's a paper that just came out a month or two ago, and it's in the very important area of investing. We know, from a lot of studies, that investors tend to be overconfident, and that's why, oftentimes, they trade all the time and make all these moves; they wheel and deal. And then, if you actually look at the data, almost nobody beats just a buy-and-hold strategy over the long term. So there are much simpler solutions to the investing problem where you can grow your wealth, and people who are overconfident, which is a lot of people, tend to over-manage their portfolios and actually do worse than just a simple buy-and-hold strategy. It's very common. One question that comes to mind: this is an area where people have tons of feedback on their performance. If you make a bad trade, you learn about it. If you make a good trade, you learn about it. You would think that, over time, people would figure it out and become more calibrated in terms of their ability to beat the market and so on. But they don't. So we asked in this work: why is that? Could it have something to do with people's memory of their past performance being biased? And that's exactly what we find, which is that people tend to remember their past performance as being better than it actually was. And that supports this feeling that they're better traders than they are.
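To put a toy number on the buy-and-hold point, here is a minimal sketch in Python; the return distribution, the trading cost, and the coin-flip market timing are all hypothetical, chosen only to illustrate how churning without a real edge drags performance below simply holding.

```python
# A minimal sketch (hypothetical numbers) of why frequent trading tends to lag
# buy-and-hold: every move pays a cost, and random timing adds no edge.
import random

random.seed(1)

MONTHS = 240                 # 20 years of monthly returns
MONTHLY_RETURN_MEAN = 0.006  # assumed average market return per month
MONTHLY_RETURN_SD = 0.04     # assumed volatility
TRADING_COST = 0.005         # assumed cost the active trader pays every month

buy_and_hold = 1.0
active = 1.0
for _ in range(MONTHS):
    market = random.gauss(MONTHLY_RETURN_MEAN, MONTHLY_RETURN_SD)
    buy_and_hold *= 1 + market
    # The overconfident trader is in or out of the market at random
    # (no real skill) and pays a cost for the activity either way.
    in_market = random.random() < 0.5
    active *= (1 + market if in_market else 1.0) - TRADING_COST

print(f"$1 with buy-and-hold grew to about ${buy_and_hold:.2f}")
print(f"$1 with active churning grew to about ${active:.2f}")
```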
[00:12:42] Amanda: Thank you. It's so interesting, because you've got this memory bias, and then you also have this knowledge illusion, thinking that you know more than you do because those around you have the information. You're in a group setting, and sometimes you're filling in these gaps, either with information you think you have or by presenting as if you already know. So it's this conglomeration of a variety of issues around the illusion that we have the knowledge.
[00:13:09] Phil: I think you're right. The reason I got so interested in overconfidence as a phenomenon is because I think it's one of the most important, if not the most important, psychological biases. And what you find when you dig in is that there are actually several different reasons why people tend to be overconfident. One of those is the knowledge illusion. We tend not to realize how complex the world is and how much is going on, and we tend to oversimplify, so that's one path to overconfidence. Another path to overconfidence is the idea of confirmation bias. I don't know if your listeners are familiar with the term, but confirmation bias is actually not one bias; it's a set of biases that basically all lead us to process information in a way that confirms what we already believe or what we want to believe. If I think the Broncos are the best team, and people probably don't believe that anymore, but if they did, then they see some sort of ambiguous outcome in a game and say, "See, that shows that they're the best." And if you are a Patriots fan, you would look at that and say, "No, that shows that they're not good." This is what people do all the time. They engage in confirmation bias; they end up taking any source, any piece of evidence, and using it to just strengthen what they already believe. That's number two. And then there's this third one, this memory bias: we seem to be hardwired to have a kind of nostalgia where we remember the good stuff and forget the bad stuff, and that can also lead to overconfidence.
So, you put all three of these things together, and you end up with a pretty powerful bias, where we tend to overestimate our abilities and overestimate our knowledge. And maybe we'll talk about this in a little bit, but overconfidence is sort of a double-edged sword, because there are situations where it can be good to be overconfident, but it can also lead to huge problems. Overconfidence can decimate individuals. It can decimate society. You know, look at the financial crisis of 2008, a case where traders were totally overconfident. They felt like they understood these derivatives markets, and they really didn't understand them at all. They became so overconfident that they thought nothing could ever go wrong, and look what happened. So that's the downside, but there can be positives too. I mean, there has to be a reason from an evolutionary perspective. There has to be a reason why, despite the dangers, people keep tending to be overconfident.
[00:15:29] Amanda: And that's a great point, Phil. Despite the danger, people are overconfident. And you've alluded to the fact that there are positives to overconfidence. Could you tell us a little bit more about what those positives might be?
[00:15:41] Phil: Sure. So I think there are two things you can benefit from by being overconfident. One of them is that you'll try something that's pretty risky but might have a high payoff. Entrepreneurs, for instance, are famously overconfident. And if you sat an entrepreneur down and walked through every reason why their business might fail, maybe they would be deterred, and they actually wouldn't start a business, because the probability of success is pretty low. Entrepreneurship is really hard. I mean, making a successful business is difficult. Most businesses fail. However, you can't succeed unless you try. Some people try, and they succeed, and wow, do they do great. So that's one benefit: it'll push you to try something that might have a high probability of failure but a big payoff. The second one is sort of a social benefit. People react to confidence, and when somebody acts confident, other people listen to them and follow them and promote them and, by the way, don't try to enter the same business as them. So there are all these different kinds of social benefits that you can get by being overconfident.
Now, of course, that comes with a danger, because if you're miscalibrated, you're gonna be surprised. A lot of those entrepreneurs who start businesses are surprised when things don't go as well as they assumed, when the world is more complicated than they thought, when something comes up that they didn't think about. And businesses, and, by the way, not just businesses but governments too, sometimes feel the sting of trusting overconfident people too much. Just look at Elizabeth Holmes, for instance. She convinced a lot of people to invest in a venture that was doomed to fail, and a lot of people got egg on their face. A lot of people lost a lot of money. And it was because she projected so much confidence. At least, that was one reason why she was so successful.
[00:17:28] Amanda: It's really interesting, Phil. We started to talk more about overconfidence, and, looping back to those three principles, you talked about oversimplification, confirmation bias, and memory bias. We've addressed a couple of these, and I'd like to explore oversimplification a little bit more, because I think you recently published some work on oversimplification, specifically as it relates to categorizing people into distinct groups even when the data is more continuous, and this really raises challenges for businesses. Could you tell us more about that?
[00:18:03] Phil: That was published in the Harvard Business Review a year or two ago with former Leeds professor Bart de Langhe, who has actually moved on to teach in Barcelona. It's called The Dangers of Categorical Thinking. What we do in that paper is talk about a bunch of these instances where people take a continuous situation and like to break things up into groups that they can understand. This is what the human mind is really built for: we're really good at categorizing. But part of the danger is that sometimes those categories aren't real things.
A good example would be a lot of personality tests, like the Myers-Briggs test, where people say, "Oh yeah, I'm an empathic introvert," or whatever it is. It turns out that those categorizations don't really have any validity to them at all. There aren't really separate groups of people that have the characteristics that fit into these groups. Everybody's different, and really, these quantities vary continuously. Yet what we do is break people into groups and say, "You're like this, and you're like this." We talk about four dangers in the paper. The first is amplification, which means that we think that people who are in separate categories are more different from each other than, in fact, they are.
A good example would be if you say that someone's a Republican or that someone's a Democrat. The picture that people have in their mind of the prototypical Republican and Democrat is a lot more extreme, in terms of distance from each other, than is true in reality. There are certainly differences; there's real polarization. But the perceived polarization is much bigger. That's amplification. Compression is the second one. Compression is the idea that we think of everybody we put into a category as being the same or very similar. That's also not the case; a lot of times, the people within a category are very different from each other. The third one is discrimination: once you put things into groups, you can start treating the groups differently. And the fourth one is what we call fossilization, which is the idea that we get this feeling that the world is a certain way because we think of it in terms of a particular categorical structure. Then we get stale, because we often fail to recognize that the world is changing; we think things are just the way they are. This often happens with businesses in industries that are changing quickly. You have a categorization of, I don't know what it is, but like, these are our target customers.
This is what they're like, or this is the way that you do innovation in this space, this is what the product matrix looks like, here's where we could add value, and so on. Then you have a radical change in an industry, and those categorical structures no longer fit, but you hold onto them. It's like electric vehicles: a lot of the OEMs, the traditional auto manufacturers, take their gas cars and throw electric drivetrains under the front hood. They don't realize that the way transport will happen in the next 20 years is a lot different, so maybe they should be thinking about the world a little bit differently. But categorical thinking forces us into these patterns, and that can be hard to break.
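To make amplification and compression concrete, here is a minimal sketch in Python with made-up numbers: a single continuous trait is split at its midpoint into two camps, the gap between the camp prototypes looks large (amplification), while the considerable variation inside each camp gets ignored (compression).

```python
# A minimal sketch (hypothetical data) of amplification and compression when a
# continuous quantity is forced into two categories.
import random
import statistics

random.seed(0)

# Simulate a continuous "attitude" score for 10,000 people, centered at zero.
attitudes = [random.gauss(0, 1) for _ in range(10_000)]

# Categorical thinking: split everyone at the midpoint into two camps.
left_camp = [a for a in attitudes if a < 0]
right_camp = [a for a in attitudes if a >= 0]

gap_between_prototypes = statistics.mean(right_camp) - statistics.mean(left_camp)
spread_within_one_camp = statistics.stdev(left_camp)

# Amplification: we picture the two prototypes as far apart (the gap).
# Compression: we ignore how much variation there is inside each camp.
print(f"Gap between the camp averages: {gap_between_prototypes:.2f}")
print(f"Spread within a single camp:   {spread_within_one_camp:.2f}")
```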
[00:21:13] Amanda: So, Phil, you just talked to us about the dangers of categorical thinking and the four patterns that emerge in categorical thinking, which is a challenge for businesses. What can businesses do to combat or mitigate categorical thinking and the four patterns therein?
[00:21:29] Phil: Sure. So in the paper, we give four suggestions. The first one is simply to increase awareness. If everyone is aware of this as being a problem, it will naturally be something that's on people's minds, and that might help. The second one is to develop the capabilities to analyze data correctly. Businesses these days are more and more dependent on their data, their data analytics, and their data analysis capabilities. So if you're doing a segmentation study, and you're used to the traditional thing, let's put people into groups and then target particular groups and so on, if you go and read the paper, you'll see that there are some problems with that traditional approach, because it is a kind of categorical thinking that doesn't always match reality. If you have the appropriate capabilities in-house to know how to properly analyze such data in more of a continuous way, that definitely helps. The third one is what we call auditing decision criteria.
Sometimes in businesses, we engage in a go/no-go decision-making style, which is really a kind of categorical thinking. If target X is hit, you go down path A. If it's not, you go down path B. That's taking a continuous quantity, say how far above or below a target we landed, and dichotomizing it into a categorical decision. It's not really good to do that, because you don't want to make a big business decision on the basis of just by chance falling on one side or the other of some arbitrary criterion. What you really want to do is find more continuous ways to mitigate the risk. So if you have really strong evidence, then you would want to take stronger action, and if you have weak evidence, you would want to take a weaker action. Thinking about your decision-making in a more continuous way is a helpful thing. And the fourth thing, to sort of combat this idea of fossilization, is to hold brainstorming meetings, we call them de-fossilization meetings, where you really try to interrogate your deepest, most foundational beliefs about the way your industry works, about the customer segments in your industry, about the competitive landscape, and so on, and ask yourself whether you're falling into this trap of having a kind of fossilized view of the world. So those are the four things that we recommend.
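Here is a minimal sketch in Python, with a hypothetical threshold and budget figure, of why auditing a go/no-go criterion matters: two projects whose evidence differs only trivially get wildly different treatment under a cutoff, while a commitment scaled continuously to the evidence barely changes.

```python
# A minimal sketch (hypothetical numbers) contrasting a go/no-go rule with a
# decision that scales continuously with the strength of the evidence.

THRESHOLD = 0.70        # arbitrary cutoff for the categorical rule
MAX_BUDGET = 1_000_000  # hypothetical investment ceiling

def go_no_go(evidence_score: float) -> int:
    """Categorical rule: invest everything or nothing based on the cutoff."""
    return MAX_BUDGET if evidence_score >= THRESHOLD else 0

def scaled_commitment(evidence_score: float) -> int:
    """Continuous rule: commit in proportion to the strength of the evidence."""
    return round(MAX_BUDGET * max(0.0, min(1.0, evidence_score)))

# Two projects whose evidence straddles the cutoff by a trivial margin.
for score in (0.69, 0.71):
    print(f"evidence={score}: go/no-go invests ${go_no_go(score):,}, "
          f"scaled rule invests ${scaled_commitment(score):,}")
# The go/no-go rule flips from $0 to $1,000,000 on a tiny difference in
# evidence; the scaled rule changes by only about $20,000.
```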
[00:23:55] Amanda: Great. Thank you. That's incredibly clear and really helpful as we think about it from a business perspective. Looping back to the first point you made there, awareness, and thinking about that from an individual lens: how do we become more self-aware? How do we not fall into the trap that is the knowledge illusion?
[00:24:16] Phil: Yeah, that's a great question. And the answer is that it's really hard. I mean, I study this stuff for a living, and I still fall for the knowledge illusion all the time. I catch myself all the time going, "Oh yeah," and then, "wait a second." It happens to me all the time. But what I have found is that you can train yourself to be better calibrated just by getting into the habit of asking yourself whether you understand something as deeply as you think you do. There's this concept that I like to talk about called intellectual humility. I think of it as the midpoint between overconfidence and diffidence, where diffidence means being so unsure about everything that you're completely paralyzed and you can't do anything. That's not good either. If you learn about this knowledge illusion and then you start thinking about how complicated the world is, you can actually pretty quickly get yourself into a pretty diffident place, because for most of the decisions we have to make, we just don't have enough information, we don't understand things well enough, and so on. But oftentimes, we do need to make a decision. There's never enough information, and as individuals, we just can't hope to know everything about a particular topic. It's just too complicated. So this midpoint of intellectual humility, between diffidence and overconfidence, is what I aspire to a lot of the time.
Like I said earlier, it's not always the best. Sometimes it's good to be a little overconfident because it gives you a nudge to try something challenging or risky. But in general, calibration is a good thing. Calibrated investors don't end up getting themselves into big trouble. Calibrated politicians don't overpromise and underdeliver, and so on. I do think that it is possible to train yourself to habitually ask the question, "Do I understand this as well as I think I do?" And then, if the answer is no, the result is not necessarily to say, "Okay, I'm not going to take a position on this issue." Take climate change: is it real? Do I understand climate change as well as I thought? Probably not. I mean, it's a pretty complicated topic. Does that mean I should throw my hands up in the air and say maybe it's not real? No. However, what it does say is that maybe you should back off the strength of your position a little bit. Maybe you shouldn't tell your cousin at Thanksgiving dinner that they're a total idiot for believing anything different from what you believe, because it turns out that you probably don't know what you're talking about either; you just happen to be, in this case, on the right side of it. I think that actually promotes more positive discourse. It leads to a lot more positive interactions and so on. So, I definitely think that's a good policy to take on board.
[00:26:40] Amanda: You had talked about being in a group and being afraid to ask questions because you want to appear as if you understand what is being talked about, and maybe you think that you do. Is there something that you can do in those situations to calibrate yourself in the moment and decide to ask that question?
[00:26:59] Phil: Yeah. I wrote an article a couple of years ago called "We Should Be Asking More Stupid Questions." The truth is that, a lot of the time, if you don't understand something, if it's not making sense to you, a lot of the people in the room feel the same way, but nobody asks the question because they don't want to look like they don't know what they're talking about. So, the quick answer is, if you find yourself in that situation, ask. The longer answer is that in order to do that, you really need a good organizational culture that promotes that kind of questioning, because if you don't have that kind of organizational culture, you can suffer penalties for being that person, even if it benefits the company. There are two sides to it. One is, we should be asking more stupid questions, and that's on individuals to some extent. But it's also on leaders and organizational cultures to foster an environment where it's okay not to know everything, because nobody ever does. Otherwise, we're kind of living a lie, going around pretending everyone knows what they're talking about, when it's just not the nature of human beings to know what they're talking about a lot of the time. Most of us know a little bit about a little bit, and that's okay, because that's who we are as people.
[00:28:11] Amanda: And in your research, did you learn anything about what businesses can do to create an environment where employees feel comfortable speaking up and asking these stupid questions, as you've called them?
[00:28:25] Phil: The evidence has been accruing for a while that organizational cultures that are not so hierarchical, that focus on leaders who try to build teams that are greater than the sum of their parts, leaders who give people some autonomy to really succeed, and leaders who are aware of their own limitations, those types of cultures, in the end, lead to better outcomes, both for the employees and for the business.
[00:28:55] Amanda: Thank you, Phil.
[00:29:12] Phil: I've enjoyed speaking with you. Thanks so much for having me.
[00:29:14] Amanda: Thank you again for listening to Leeds Business Insights, and a special thank you to my guest, Phil Fernbach. Make sure you don't miss a single episode. Subscribe to Leeds Business Insights wherever you get your podcasts. See you next time!
The Leeds Business Insights podcast offers a cutting-edge perspective of trending topics, along with actionable insights from faculty, alumni and global leaders, to help you navigate the evolving world of business.
We invite you to subscribe wherever you get your podcasts, read about our latest episodes or listen to the most recent podcast below.