David McRaney is a science journalist, author, and host of the You Are Not So Smart podcast. Below, he shares 5 key insights from his new book, How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion. Listen to the audio version—read by David himself—in the Next Big Idea App.
1. Persuasion is not coercion.
Persuasion is not coercion. It is also not an attempt to defeat your intellectual opponent with facts or moral superiority, nor is it a debate with a winner or a loser. Persuasion is leading a person along in stages, helping them to better understand their own thinking and how it could align with the message at hand. You can’t persuade another person to change their mind if that person doesn’t want to do so, and the techniques that work best focus on a person’s motivations more than their conclusions. At its core, persuasion is encouraging people to realize that change is possible. Ultimately, all persuasion is self-persuasion. People change (or refuse to do so) based on their desires, motivations, and internal counterarguing. By focusing on these factors, an argument becomes more likely to change minds.
2. Certainty is an emotion.
Beliefs aren’t ideas stored in your brain, possessions on a shelf, or files in a biological computer. Belief is a process. To believe or doubt is the result of neurons in associative networks delivering an emergent sensation of certainty, or lack thereof. The speed of change is inversely proportional to the strength of our certainty, and certainty is a feeling: somewhere between an emotion and a mood, more akin to hunger than to logic. Persuasion, no matter the source, is a force that affects that feeling.
“Ultimately, all persuasion is self-persuasion.”
3. Social death is greater than physical death.
Brains resist change to some degree because updating when you shouldn’t is dangerous (you might become wrong). But since not updating when you should is also dangerous (you might stay wrong), the brain walks a tightrope, changing its mind carefully given a variety of motivations and goals.
The strongest motivation to resist change is the fear of shame and ostracism. As social primates, humans value being good members of their groups much more than they value being right—factually, morally, or otherwise—so much so that as long as we have a group that satisfies our needs, we will choose to be wrong if it keeps us in good standing with our peers. As the sociologist Brooke Harrington puts it, if there were an E = mc² of social science, it would be SD > PD: “social death is more frightening than physical death.” This is why we feel deeply threatened when a new idea challenges the ones that have become part of our identity.
When our ideas identify us as members of a group, we want to seem trustworthy, and managing our reputation as a trustworthy individual often supersedes most other concerns, including our mortality.
“The brain walks a tightrope, changing its mind carefully given a variety of motivations and goals.”
4. Disambiguation leads to naïve realism.
Disambiguation is what brains do when confronted with novelty and uncertainty. We use what we think we know to disambiguate the ambiguous. I love that term, especially because it comes from reading comprehension: the act of deriving meaning through context when a word, phrase, or entire essay could be interpreted in many ways.
There’s a term in psychology that pairs well with disambiguation: “naïve realism.” This is the certainty you feel when you are blind to the fact that you are disambiguating, meaning your interpretation doesn’t feel like an interpretation. Since subjectivity often feels like objectivity, naïve realism makes it seem as though the best way to change people’s minds is to show them the facts that support your view, because anyone who has read the things you have read or seen the things you have seen will naturally see things your way, provided they’ve pondered the matter as thoughtfully as you have. Therefore, you assume that anyone who disagrees with your conclusions probably just doesn’t have all the facts. If they did, they’d already be seeing the world as you do.
This is why you continue to ineffectually copy and paste links from all of your most trusted sources when arguing points with those who seem misguided, crazy, uninformed, or plain wrong. The problem is that this is exactly the same approach that the other side thinks will work on you.
“Since subjectivity often feels like objectivity, naïve realism makes it seem as though the best way to change people’s minds is to show them the facts that support your view.”
5. No one is unreachable.
Imagine attempting to reach the Moon with a ladder and, upon failing, giving up in frustration because now you believe the Moon is unreachable. When we use the wrong tools and approaches, the people on the other side of the issues we care about can seem impossible to reach. That’s why I used to avoid arguing about politics, superstitions, or conspiracy theories, but I changed my mind about how minds change (and how to change them) after meeting the experts and activists who not only showed me a better way, but also explained the science behind why it works.
Even the people who seem furthest from what you consider the ground truth are only a moonshot away once you understand the nature of resistance and the proper techniques for avoiding it. The ability to change our minds, update our assumptions, and entertain other points of view is one of our greatest strengths. It is an evolved ability that comes free with every copy of the human brain. To leverage that strength, we must avoid debate and start by having conversations.
Debates have winners and losers, and no one wants to be a loser. But if both sides feel safe to explore their reasoning, to think about their own thinking, to explore their motivations, then we can each avoid the dead-end goal of winning an argument. Instead, we can pursue the shared goal of learning the truth.
To listen to the audio version read by author David McRaney, download the Next Big Idea App today: