How to Be Better at Being Wrong
You probably don't have to think very hard to remember the last time you did something dumb – maybe it was investing your life savings in Blackberry, or playing pickup basketball while out of shape and rupturing your Achilles. Well, you probably don't actually regret the decision to do those things at all. You regret the outcome: not getting the yacht you'd imagined would carry you and your fat bank account to the Mediterranean; not having an Achilles. What happens is not a reflection of the quality of the choice we made, because the outcome is out of our control. That is both liberating and terrifying: you can't blame yourself, but only because so much of our life is decided by dumb luck and information we don't have.
Decision strategist Annie Duke plumbs this idea in her book, Thinking in Bets: Making Smarter Decisions When You Don't Have All the Facts. Early on, she was introduced to the ideas of decision strategy thanks to card games with her dad and brother – and, while watching her mother fight a battle with her health, she became acutely "aware of the uncertainty of things." She says she had a choice: "Either be completely upset by it, or learn to embrace it and figure out how to go forward." So she double majored in English and psychology, before getting a National Science Foundation Fellowship to study cognitive linguistics at Penn. But, before finishing, she ended up in Montana, playing cards – a detour that turned into a nearly two-decade career. Talk about embracing uncertainty.
Using insights she learned – and highlighting them with culturally relevant examples (like the Seahawks' fateful goal-line interception in the Super Bowl, and Nate Silver's 2016 election projections) – Duke embarks on a fascinating exploration of the ways in which faulty wiring has made humans exceptionally fallible: we are incapable of objectively evaluating our decisions; we do very poorly with probabilities, preferring to deal only in black and white; we care too much about being "right" and "wrong"; we're susceptible to fake news (evolutionarily, it was better to have heard a rustle and thought, "Lion! Run!" than to have heard a rustle and thought, "Hm. Is that a lion? Let's collect some facts."); and we too often confuse agency and luck.
In conversation here, Duke highlights her insights and explains that by better understanding our implicit biases and finding comfort in uncertainty, we just might become more self-assured, compassionate, and carefree as we move through an unsure world.
GQ: Why are we so frightened by the idea of uncertainty?
Annie Duke: We seek connections between things. We want things to be causal. We look for patterns. From an evolutionary standpoint, you need to be able to. It's important, for example, for us to be able to recognise our own mother's face. We're all certainty seekers, and it's because of the way that we're wired.
That being said, the more you can get comfortable with uncertainty, the better off you are. It's a more accurate representation of the world. You're able to be calmer, and more compassionate, towards other people and towards yourself. Things aren't always going to work out. You can make all the best decisions in the world, and it can go awry. You can make the worst decision ever, and it can go just fine.
To that point: why is it so hard for us to separate outcome from the decision?
It's called resulting. Resulting is thinking that the quality of an outcome is a really good signal for the quality of the decision. What does that mean? If something doesn't work out, it was a bad decision. If it does work out, it was a good decision.
And you have that great example of Pete Carroll’s Super Bowl blunder: instead of running the ball from the one-yard line to win the Super Bowl, he calls a pass play, it gets intercepted, and the Seahawks lose a game they probably should’ve won.
The headlines were completely brutal. It was "the worst play in Super Bowl history." There's a lot of very good thought behind making that decision. You could go through all of that analysis, but you could also do this really simple thought experiment: what if he had called for the pass play and the ball got caught and [they] won the Super Bowl? What would the headlines have looked like?
How that ended up turning out that one time shouldn't have any effect on whether the decision is good or not. The decision is good independent of whether the ball gets caught or not. The headlines shouldn't change [based on the outcome], but they swing wildly. This is the problem of resulting: we take this bad outcome, we think it's a signal for the decision quality, and then we change the way that we make decisions in the future, based on this one outcome. There's too much luck in life to do that.
You also get into Nate Silver’s election projection, and how humans have an inability to think in percentages. The polls said Trump had a 30 percent chance of winning, and people are like, "Oh, all the polls are wrong." It's like, "Well no, 30 percent is actually not zero percent."
It's very, very, very, very far away from zero. If I said to you, "Here's a gun with nine chambers, and there are three bullets in it. Do you care to play?" I'm just guessing your answer's no. I think we really want to feel like we have agency, that we have control. That's a problem.
What I thought was really interesting was how incredibly vilified Nate Silver got in that, when Nate Silver actually was very reasonable. He was saying, "35 percent of the time, Trump is going to win." Again, if I gave you a gun with 100 chambers in it and 35 of them are filled, are you willing to shoot yourself in the head? What happened was that [in] the translation from 35 percent to the punditry, [the percentage] gets lost. They just round up. They go from 65 percent to: Hillary Clinton's going to win. In order to be believable communicators, in order to come across as confident, as knowledgeable, as having our opinion worth hearing, we need to express things with certainty. [A pundit is] not going to say, "Well Anderson, let's not get ahead of ourselves. We know that if we run this election 100 times, 35 of those 100 times Trump's going to win. 65 of those times Clinton's going to win." Anderson's going to go, "We don't want you on our show anymore."
Now, notice all of the downstream effects. Everybody was super shocked. What did you start hearing? "You were wrong." They weren't saying, "You pundits were wrong." What were they saying? "You pollsters were wrong." Then that created the downstream problem of now people just generally dismissing pollsters. That's saying, "You data scientists, you have no idea what you're talking about. Polls are wrong." The poll said Clinton was going to win the popular vote by around two percent. Last time I checked, that was pretty right.
Generally, in our lives, we don't have enough outcomes to really make any sense of the decision quality, based on the one outcome alone. We're not normally flipping a coin 10,000 times. We're normally flipping it a couple of times, and trying to figure out something of value. We're trying to learn all of this stuff from Trump's election, and it was one result. I don't think we know very much from it.
Does it feel to you like we're more averse to being wrong than we ever have been before?
I don't know that that is true. I think it feels that way because it's very loud. In the past, it felt like we were better at being "wrong," or at least more moderate, because if you think about the way news was delivered, there were three channels, trying to capture the largest audience possible. Now, channels are very tailored. I think that you're getting your point of view spewed back at you a lot more, so I think there's less exposure to opposing views. That's problematic.
[But] this has been a problem throughout human history: our unwillingness to change our minds, to be open-minded, to listen to dissenting voices; our conviction that we're right. There's all sorts of points in human history where you can see these kinds of breakdowns occurring, right? I think it feels like it's worse now, but I don't really think it is. I think this is generally a problem with human cognition.
Okay, then what are some ways that we can get better at being wrong?
Number one is to stop thinking about things as right and wrong, and black and white. Most of the time it turns out that we're a little bit less right than we thought. I imagine that if I pinned you down on any opinion, you would realise that whatever belief you have, sort of by definition, can't be 100 percent.
That means that you weren't really right in the first place. You were somewhere in between. Now, when somebody gives you evidence that could cause you to calibrate your belief, if you aren't thinking about yourself as right in the first place, it allows you to be very open-minded to information that might generally disagree with something you believed. If you're walking around going, "Hey, I'm 65 percent," and now you tell me something different, I'm like, "Hmm, maybe now I'm 72 percent on it because you just strengthened my belief with your evidence." The world isn't divided into right and wrong.
You've done a lot of work on human behavior and cognition outside of this book. If you could pass along the one or two most important insights from everything you've read or learned, I'm curious what those might be.
The idea of cutting your losses. You see this in people waiting in lines all the time. You're in the grocery store and you're Einstein with the lines, right? You're calculating out how many people are in the lines. How much stuff do they have in their cart? Do they have coupons? How fast does the cashier look? Once you get in the line, you don't usually switch, even if it turns out that your line is much slower and maybe you should switch. You don't treat it as a new decision. You stay in the line. People feel like if they abandon it, they will have wasted the time. Except the time is already gone. That's what they don't understand. You can't get the time back; it's in the past. You don't have a time machine.
[Another thing] I figured out really fast with cards is that [losing players] had this tendency to say they got unlucky. When players were winning, it was, "I'm such a great player. I'm such a genius." They were offloading the bad results to luck. In cognitive psychology, it's called self-serving bias. You can see why, because it serves yourself to field the outcomes that way. That pattern is really, really devastating to learning. If all the bad results aren't your fault, if they're due to luck, there's nothing to learn from them. You won't go back and examine if there was something that you could have done to increase the probability of a good result. If all the good results are because of skill, then you're reinforcing the decision-making that led to that result, and that decision-making might not have actually been very good. Outcomes and decision quality are only very loosely related. We don't ever want to create this sort of one-to-one relationship between them.
Self-serving bias colours all of the ways you hear people talking to each other. [It could be] about relationship partners: "I can't believe they were such a jerk to me." Or somebody gets a promotion and it's like, "I can't believe I didn't get the promotion. It's just that the boss doesn't like me and they liked that other person," or, "The other person schmoozed the boss better than I did." These are all ways in which we take the things that happened and turn them into things that happened to us.
You talk about how it feels really bad to us, like an attack on our own self-image, to admit that we're wrong. That’s one reason that we're averse to being wrong, and we sort of need to give that up. At the same time, that means giving up the good feeling when you're right. But being right about something is a visceral feeling. It's emotional.
You talk a lot in the book about how we can't work outside the hard-wiring of our brain, so how do you walk it back from that good feeling, or when you're wrong, how do you walk it back from that bad feeling?
I think it depends on how you define winning in the game. How are we defining what it means to be right or wrong? If we compete on something, who's going to win: the person who's right in the sense of, "I'm just affirming all the beliefs that I have," or the person who's developed the most accurate view of the world? It's going to be the person who's developed the most accurate view of the world.
That means that you must be open-minded to information that disagrees with you. You have to be seeking that out, because if you challenge me, I better go and figure out, "What do you know that I don't know? What information do you have? What am I missing? What could I go read that's going to help me figure out whether this is a good bet or not?" It's going to immediately put me into a more open-minded stance toward information.
This interview has been edited and condensed.