Thinking, Fast and Slow

Our society is built on the premise that human beings are more or less rational. We trust that our leaders, judges, scientists, and other experts are making fair and unbiased decisions, and that we ourselves are seeing the world as it is and making the best choices we can.

If only that were true! In fact, human thinking is riddled with biases, oversimplifications, and distortions. And these mistakes aren’t random, either. In Thinking, Fast and Slow, Nobel Prize-winning psychologist Daniel Kahneman attempts to uncover the patterns of error in human judgment, and suggests ways we can fine-tune our thinking to make better decisions and see the world more clearly.

Read on for five key insights from Thinking, Fast and Slow. To listen to the audio version of this Book Bite, download the Next Big Idea App today.

1. You are always of two minds.

You don’t have one mind. When it comes to thinking, Daniel Kahneman argues, you have two. While they go by many names, Kahneman prefers the terms System 1 and System 2:

  • System 1 thinks quickly and automatically, with little to no effort and no sense of voluntary control.
  • System 2 operates slowly, calculating and reasoning. It’s what we think of as our conscious, deliberate mind.

System 1 is what lets you immediately recognize that one person is farther away than another, for example, or lets you solve 2 + 2 without having to think about it. System 2 is what lets you calculate just how far away that person is, or helps you solve a harder math problem like 16 x 43.

“By better understanding our two-system mind and how it operates, we can avoid mistakes in reasoning and make better decisions.”

This division of labor is highly efficient. For a mind that is constantly being bombarded with stimuli, it keeps effort to a minimum. System 1 can take care of a lot of basic tasks without any input from System 2. It works most of the time because System 1, despite being automatic and intuitive, is pretty good at modeling familiar situations and reacting to challenges.

But System 1 can make mistakes. It’s prone to biases, cognitive illusions, and overconfidence. We often fail to recognize these errors because they happen so quickly, before our conscious mind has even kicked into gear. By better understanding our two-system mind and how it operates, we can avoid mistakes in reasoning and make better decisions.

2. You can’t expect people to behave rationally.

Much of economic theory—and a lot of everyday behavior—relies on the premise that human beings are fundamentally rational actors. The assumption that most people are behaving in their own self-interest is baked into how we think about the world and how leaders make policy.

But drawing on numerous research studies, Kahneman demonstrates that our two-system mind leaves us prey to a wide array of biases and fallacies, cognitive quirks that reliably steer us away from rational choices.

For example, when taking on risky projects or investments, decision-makers often fall victim to what Kahneman calls the “planning fallacy.” This is our tendency to make optimistic predictions based on what our fast-acting System 1 mind hopes will happen, rather than on a careful System 2 analysis of what is likely to happen given statistics or past experience. The fallacy shapes both corporate and individual behavior, as we reliably overestimate benefits and underestimate costs.
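
The book’s antidote to the planning fallacy is to take what Kahneman calls the “outside view”: base a forecast on the recorded outcomes of a reference class of similar past projects rather than on your own plan. Here is a minimal sketch of that idea in Python; the overrun ratios below are invented for illustration.

```python
# A minimal sketch of "outside view" forecasting: adjust an optimistic
# inside estimate using the cost overruns of similar past projects.
from statistics import median

def outside_view_estimate(inside_estimate: float, past_overruns: list[float]) -> float:
    """Scale the inside estimate by the median actual/planned cost ratio
    observed in a reference class of comparable projects."""
    return inside_estimate * median(past_overruns)

# Ratios of actual cost to planned cost for comparable past projects
# (hypothetical data): 1.0 means on budget, 1.5 means 50% over.
past_overruns = [1.2, 1.4, 1.1, 1.8, 1.5, 1.3]

plan = 100_000  # System 1's hopeful inside estimate, in dollars
print(f"Inside view:  ${plan:,.0f}")
print(f"Outside view: ${outside_view_estimate(plan, past_overruns):,.0f}")
# Outside view: $135,000 -- the base rate says expect ~35% over budget.
```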

“Our two-system mind leaves us prey to a wide array of biases and fallacies, cognitive quirks that reliably steer us away from rational choices.”

The planning fallacy is just one of dozens of mental distortions that Kahneman and his fellow researchers have identified; together, they amount to a devastating critique of the rational choice theory favored by many economists. By learning about these cognitive biases, we’re more likely to catch the false conclusions they produce and see the world more accurately.

3. Experience and memory are in conflict.

You may think that your experiences (“I’m having fun!”) and your memories (“That was fun!”) are closely connected. But Kahneman points out that they can diverge drastically, and that has implications for the choices you make. For example, buying an expensive luxury item may give you a lot of actual joy in the moment. But with hindsight, the cost of the purchase may weigh on you, and you may decide it wasn’t worth it. So the next time you have a purchase to make, do you optimize for the in-the-moment experience, or for the memory of that experience, which is what you’ll retain over the long term? It’s not always easy to decide, because your two selves have different priorities.

The “experiencing self” is rooted in System 1, which favors short periods of intense pleasure and dreads sharp pains. The System 2 “remembering self” tells a story about that experience, and that story changes over time. For example, patients undergoing brief but painful medical procedures report suffering more than patients who had longer procedures where the pain was initially just as great but lessened over time. Even though the second set of patients experienced more total pain, the way the pain was distributed over time made them remember the procedure more favorably.
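
Kahneman summarizes this finding as the “peak-end rule”: the remembering self scores an episode roughly by the average of its worst moment and its final moment, largely ignoring how long it lasted (“duration neglect”). A rough sketch, with invented pain ratings on a 0-to-10 scale, one reading per minute:

```python
# A rough sketch of the peak-end rule: remembered pain is approximated
# by the average of the worst moment and the final moment, while total
# experienced pain is the sum over the whole episode.

def peak_end(pain: list[float]) -> float:
    """Approximate remembered pain as the mean of peak and end."""
    return (max(pain) + pain[-1]) / 2

short_procedure = [7, 8, 8]            # brief, ends at high pain
long_procedure = [7, 8, 8, 5, 3, 1]    # same start, pain tapers off

for name, pain in [("short", short_procedure), ("long", long_procedure)]:
    print(f"{name}: total experienced pain = {sum(pain)}, "
          f"remembered pain ~ {peak_end(pain)}")

# short: total = 23, remembered ~ 8.0
# long:  total = 32, remembered ~ 4.5
# The longer procedure involves more total pain but is remembered
# as less unpleasant, because it ends gently.
```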

The conflict between the remembering and experiencing selves can keep philosophers awake at night. Which one is the “real you”? In practice, knowing whether to optimize for the experience or for the memory can be difficult, but the best outcomes are likely when we learn to keep both selves in mind.

“The conflict between the remembering and experiencing selves can keep philosophers awake at night.”

4. Losses loom larger than gains.

First published by Kahneman and his longtime collaborator Amos Tversky in 1979, prospect theory is a model of decision-making behavior informed by Systems 1 and 2. It was cited in the 2002 decision to award Kahneman the Nobel Prize in Economics.

A key aspect of the theory is loss aversion. This is the observation that the pain people experience from losing $50 is much more intense than the joy they experience from earning $50. In other words, we’ll do more to avoid a loss than we will to achieve a gain.
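
Prospect theory makes this asymmetry precise with a value function that is steeper for losses than for gains. Below is a minimal sketch using the parameter estimates Tversky and Kahneman published in 1992 (curvature α ≈ 0.88, loss-aversion coefficient λ ≈ 2.25); the exact numbers vary from study to study.

```python
# A minimal sketch of the prospect-theory value function. Outcomes x
# are gains or losses relative to a reference point, not total wealth.
# Parameters are the 1992 Tversky-Kahneman estimates: losses weigh
# about 2.25 times as heavily as equal-sized gains.

ALPHA = 0.88   # diminishing sensitivity to both gains and losses
LAMBDA = 2.25  # loss-aversion coefficient

def value(x: float) -> float:
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * (-x) ** ALPHA

gain, loss = value(50), value(-50)
print(f"value of gaining $50: {gain:+.1f}")           # about +31.3
print(f"value of losing  $50: {loss:+.1f}")           # about -70.4
print(f"losses loom {abs(loss) / gain:.2f}x larger")  # 2.25x
```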

The real-world implications of this theory can be seen in how people behave when buying or selling stocks or real estate. Kahneman points to a study of the condominium market in Boston during a downturn. Sellers of nearly identical condos set very different asking prices based on what they had originally paid for their homes. Even though the original purchase price shouldn’t matter to the current selling price (only the quality of the condo and present market conditions should), the owners who had paid more were determined to avoid any perceived loss, so they spent longer trying to sell their homes at higher prices.

For those condominium sellers, their original purchase price was a reference point that impacted what they thought they should sell for, regardless of market realities. “Reference dependence,” as Kahneman called it, is our tendency to perceive the value of things and experiences relative to some status quo that we hold in mind—whether it’s our memory of how much a thing used to cost, or our neighbor’s fancy car that influences what kind of car we think we “should” have.
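
In code, reference dependence simply means scoring an outcome relative to a chosen baseline rather than in absolute terms. Here is a toy version of the condo scenario; the prices are invented, and the flat 2x weighting on losses is a stand-in for the loss-aversion coefficient above, not Kahneman’s exact model.

```python
# Toy illustration of reference dependence: the same $300k sale feels
# different depending on what the seller originally paid.

def felt_outcome(sale_price: float, reference: float, loss_weight: float = 2.0) -> float:
    """Psychological impact of a sale, judged against a reference price:
    gains count at face value, losses hurt loss_weight times as much."""
    change = sale_price - reference
    return change if change >= 0 else loss_weight * change

offer = 300_000
for paid in (250_000, 350_000):
    print(f"paid ${paid:,}: selling at ${offer:,} "
          f"feels like {felt_outcome(offer, paid):+,.0f}")

# paid $250,000: feels like +50,000 (a gain -- likely to accept)
# paid $350,000: feels like -100,000 (a $50k loss that hurts like
# $100k -- likely to hold out for a higher price)
```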

Reference points and loss aversion help explain a lot of the choices we make, including many irrational ones. Knowing about these tendencies can help us choose more rationally, and spend and invest more wisely.

“Reference points and loss aversion help explain a lot of the choices we make, including many irrational ones.”

5. To think smart, think slow.

As we’ve seen, our two-system mind is subject to all sorts of cognitive errors. So what can we do about it? How can we more reliably make good decisions? Kahneman offers several strategies.

First, recognize when your automatic, System 1 thinking is leading you astray, then slow down and get reinforcement from System 2. While this can be very hard to do in the moment, knowing that cognitive biases and fallacies exist makes it more likely you’ll spot them. It’s also a lot easier to recognize when others are wandering into a cognitive minefield, so exchanging feedback with colleagues and other observers makes errors easier to catch.

You can also put structures in place around you that make decision-making more deliberative. For example, Kahneman recommends leaders adopt checklists, which help avoid oversights and encourage a culture of slow thinking.

Broadly speaking, there are three stages to making a decision:

  • Framing the problem
  • Collecting the relevant data
  • Reflecting on and reviewing the information

According to Kahneman, paying careful attention to each stage of the process will result in improved decision-making. Whether it’s planning corporate strategy or making an individual choice, we think best when we think slower, when both System 1 and System 2 are allowed to do their jobs.

To listen to the audio version of this Book Bite and many more, download the Next Big Idea App today.
