Meeting Daniel Kahneman in real life is the psych-nerd equivalent of hanging out with Bob Dylan. Both have recently been awarded Nobel Prizes. Both transformed their fields. While Dylan bulldozed folk and reinvented rock with an electric guitar, Kahneman and his longtime collaborator Amos Tversky employed clever study designs to reveal how misled humans are by intuitions and mental shortcuts — which they termed heuristics — and how reliably irrational we can be. As Michael Lewis details in his new book on their collaboration, The Undoing Project, the duo’s research upended foundational assumptions not just in psychology, but in economics, medicine, professional sports, and beyond.
This week, I had the privilege of nabbing a half-hour of conversation with Kahneman before he went onstage at a private dinner in Manhattan. One burning question I had was why the mind so readily defaults to the automatic, unconscious mental shortcuts that he detailed in Thinking, Fast and Slow. The 82-year-old Princeton psychologist’s answer: people think the way they see. By recognizing this, you might be a little less likely to fixate on first impressions or fall victim to confirmation bias.
“In visual perception, you have a process that suppresses ambiguity,” Kahneman tells Science of Us, “so that a single interpretation is chosen, and you’re not aware of the ambiguity.”
Consider the form at the center of the optical illusion below.
Depending on context — whether you’re “reading” the image horizontally or vertically — the character snaps into place as the letter B or the number 13. In perception, this process isn’t mysterious, Kahneman says: one interpretation is chosen and the alternate reading is suppressed. None of it is palpable to you, the viewer. The analogous, and less obvious, process happens all the time in our habits of mind.
“When we reach interpretations, many of the characteristics of visual perception are retained, like a search for coherence, things that make sense together,” he says. “You’re very likely to perceive things that aren’t there in perception. All of us do that.”
Take, for instance, the notorious stickiness of first impressions. “There is some accuracy with a first impression, but if you’re going to be with a person for a long time, I’m not sure those early impressions are very useful to you,” he says. (Consider that the next time you want to ghost someone after one date.) Because of this, he says, first impressions tend to be self-fulfilling: If you take someone to be hostile toward you, you’ll act hostile toward them, prompting hostility in return — and leaving you convinced you had the impression right the whole time.
Indeed, people can’t help but impose the letter B or the number 13 on the messiness of life. “Where does confirmation bias come from? Confirmation bias comes from when you have an interpretation, and you adopt it, and then, top down, you force everything to fit that interpretation,” Kahneman says. “That’s a process that we know occurs in perception that resolves ambiguity, and it’s highly plausible that a similar process occurs in thinking.” Which is precisely why you — or a president — shouldn’t trust everything you think. Unfortunately, the more powerful you are, the more you believe your own thoughts.