Foolproof: Why Misinformation Infects our Minds and How to Build Immunity

Sander van der Linden is a Professor of Social Psychology and director of the Cambridge Social Decision-Making Lab at the University of Cambridge. He also leads national consensus reports on the psychology of misinformation and serves on the World Health Organization’s (WHO) infodemic working group.

Below, Sander shares five key insights from his new book, Foolproof: Why Misinformation Infects our Minds and How to Build Immunity. Listen to the audio version—read by Sander himself—in the Next Big Idea App.

1. Misinformation spreads like a virus.

Let’s talk about Anthony. Anthony is a young man who had never really shown any interest in politics. He lives in the suburbs of Chicago and used to sell solar panels. At some point during the pandemic, he became consumed with conspiracy theories about the election. So much so that on January 6th, 2021, he decided to storm the Capitol wearing a bulletproof vest. After his arrest, Anthony apologized and said he had fallen victim to a contagious conspiracy theory. Referring to the torrent of misinformation that had temporarily engulfed his client, Anthony’s lawyer stated, “You can catch this disease.”

This isn’t just a loose analogy. Just as viruses take over their host cells in order to spread and replicate (a process called molecular hijacking), research shows that misinformation can deeply infiltrate our consciousness. It’s not always immediate. Just as a virus’s incubation period can range from a few days to years before people start showing symptoms, bad information can fester in our minds for long periods of time. Misinformation can exploit the brain’s cognitive biases, influencing our beliefs and behavior. It can even alter our memories of events that may never have happened.

The ultimate goal of the misinformation virus is to replicate itself: to have people spread false information to others. Here’s where things get really interesting: we can use models from epidemiology, normally used to study the spread of biological viruses, to study how information pathogens spread in social networks. In this kind of network, people are the nodes with links to others. Exposure to a misinformation pathogen can activate and infect an individual who can then readily spread it to others. We can even calculate the basic reproduction number or the R0 value, that is, the average number of people who will start posting fake news following contact with an infectious individual.
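To make the analogy concrete, here is a minimal sketch of such a contagion model in Python: one “patient zero” posts a false story, each newly infected person shares it onward with some probability, and we estimate an R0-style value from the simulated cascades. The network size, contact structure, and sharing probability are all arbitrary illustrative assumptions, not parameters from any real study.

```python
# Minimal sketch: misinformation spreading through a social network,
# modeled like a simple contagion. All parameters are illustrative.
import random

random.seed(42)

N = 1_000        # people (nodes) in the network
CONTACTS = 8     # links per person
P_SHARE = 0.15   # chance an "infected" person convinces a given contact
STEPS = 20       # rounds of sharing

# Build a random contact network: each person is linked to a few others.
links = {i: random.sample([j for j in range(N) if j != i], CONTACTS)
         for i in range(N)}

infected = {0}         # patient zero posts the false story
ever_infected = {0}
secondary_cases = []   # new infections caused by each infectious person

for _ in range(STEPS):
    new_infections = set()
    for person in infected:
        caused = 0
        for contact in links[person]:
            if (contact not in ever_infected
                    and contact not in new_infections
                    and random.random() < P_SHARE):
                new_infections.add(contact)
                caused += 1
        secondary_cases.append(caused)
    ever_infected |= new_infections
    infected = new_infections  # previous sharers stop reposting

# R0-style estimate: average number of new sharers per infectious person.
r0 = sum(secondary_cases) / len(secondary_cases)
print(f"Estimated reproduction number: {r0:.2f}")
print(f"Share of network ever exposed: {len(ever_infected) / N:.0%}")
```

With these toy numbers, each sharer convinces a little over one new contact on average early on (8 × 0.15 = 1.2), which, just as with a biological virus, is enough for the story to percolate through a large share of the network before burning out.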

In 2020, the World Health Organization (WHO) declared a worldwide “infodemic” to emphasize the striking parallels between how viruses and misinformation spread. Consider, for example, that people might not be intentionally spreading false information: many of us are asymptomatic, forwarding misinformation without even realizing it!

2. Misinformation can kill people—and democracy.

Just as some viruses can ultimately kill people, misinformation has had deadly consequences too. In India, the spread of false rumors on WhatsApp has led to hundreds of violent mob lynchings. In Iraq, people, including children, have been hospitalized after ingesting toxic products that were falsely advertised on social media as a cure for the coronavirus. A recent report calculated that misinformation cost the Canadian healthcare system at least $300 million and led to thousands of preventable deaths. People have died from COVID-19 because they refused treatment based on false conspiracy theories. A man in Seattle shoved a sword through his own brother’s head because he thought his brother was a shapeshifting lizard. Over 50 phone masts have been set ablaze in the UK alone because of false conspiracy theories linking 5G towers to coronavirus “hotspots.” Moreover, the consequences of misinformation can be long-term. In the early 2000s, vaccination coverage in the UK, and later the US, plummeted following false claims that the MMR vaccine can somehow cause autism. It took many years for vaccine uptake to recover.

“You only need to dupe a minority of people in order to potentially undermine democracy.”

Misinformation isn’t just a direct threat to people’s health and well-being; some of its consequences are more pernicious and indirect. Ultimately, in order for misinformation to spread, it requires a susceptible host. As such, misinformation can do the most damage when it taps into existing prejudices, conflict, growing societal tensions, and political polarization. Research shows that exposure to misinformation can lead people to lose trust in each other, in institutions, in the mainstream media, and in the electoral process itself.

One of the biggest misconceptions is that in order for misinformation to be influential, it needs to impact large groups of people. That’s simply not true. With the assistance of dark posts, micro-targeting, and other manipulation techniques, actors can synthesize and tailor misinformation to just those people who are deemed most susceptible to persuasion. Many elections are decided on increasingly narrow margins. In short, you only need to dupe a minority of people in order to potentially undermine democracy.

3. Pre-bunk or inoculate against misinformation.

If we continue to follow the viral analogy, we inevitably arrive at a solution: the potential for a vaccine against misinformation. This idea of a psychological vaccine against fake news is something the Social Decision-Making Lab at the University of Cambridge has been working on for many years now.

However, before we get to the power of pre-bunking, it’s useful to understand some of the limitations of traditional approaches, like debunking and fact-checking. Although fact-checking is important, misinformation can be launched in a split second, while a good fact-check takes days, if not weeks, to craft. It’s like a game of Whack-A-Mole: when you debunk one falsehood, another pops up in its place.

Moreover, even if you are successful, misinformation lingers in the brain. What if I told you that earlier this week I got terrible food poisoning from a place down your street, so you should probably avoid it? Then, a week later, I tell you that I was mistaken; it was actually some other place. Regardless of this correction, every time you now pass that restaurant, you’re going to think of food poisoning. That’s what we call the “continued influence of misinformation.” Jonathan Swift captured it well when he said, “Falsehood flies, and truth comes limping after it, so that when men come to be undeceived, it is too late; the jest is over, and the tale hath had its effect.” In other words, you can’t un-ring a bell.

For all of these reasons, it is easier to pre-bunk misinformation. This approach is all about building resilience to falsehoods in advance, and it follows the vaccination analogy exactly. Just as introducing a weakened or inactivated strain of a virus into the body triggers the production of antibodies that help fight off future infection, it turns out you can do the same with misinformation. By pre-emptively exposing people to a weakened dose of misinformation, or to the techniques used to spread it, and refuting it, people can build up cognitive or mental immunity. Just as with regular vaccines, the more examples your body has of a potential invader, the better it can mount an immune response. Our research shows that it works the same way with the human mind: the more we prepare people with weakened doses of the tactics used to mislead, the better they become at identifying and neutralizing manipulation. So how do you achieve this in practice?

“One such technique is called a ‘false dilemma,’ which leads people to think there are only two options, while in fact, there are more.”

Well, you leverage Star Wars. In one pre-bunking video, we forewarned people that manipulators use predictable techniques to dupe them. One such technique is called a “false dilemma,” which leads people to think there are only two options, while in fact, there are more. It’s heavily deployed in politicized debates, by so-called YouTube “gurus,” and in misleading news headlines. For example, consider the claims “either you join ISIS or you’re not a good Muslim” or “we either need to improve our education system or deal with crime on the streets.” Back to Star Wars. In the experiment, we exposed people to an inactivated strain of the virus: a scene from Revenge of the Sith in which Obi-Wan confronts Anakin Skywalker, who declares, “If you’re not with me, then you’re my enemy!” Obi-Wan replies, “Only a Sith deals in absolutes.” We then tested people with a range of misleading headlines that use this tactic and found that they could better identify and neutralize attempts to mislead them. The pre-bunk acts like a broad-spectrum vaccine. In short, prevention is better than cure!

4. Psychological herd immunity.

The idea behind a “cognitive vaccine” is powerful insofar as we can prepare people individually against impending misinformation. When you follow the viral analogy to its logical conclusion, however, you realize that the job is not yet done. Only when enough people in the population have attained a sufficient level of immunity will misinformation no longer have a chance to spread.

Based on data from our experiments, computer simulations showed that, in theory, herd immunity is not out of the question. But in order to scale the vaccine, we needed two innovations. The first is that you can’t pre-bunk every single falsehood; instead, it’s much more efficient to focus on the underlying techniques that are used to spread misinformation. Luckily, these are predictable. In the 1800s, people were fear-mongering that the cowpox vaccine (used against smallpox) would turn you into a human-cow hybrid. Now the COVID-19 vaccines are supposedly changing your DNA. It’s the same trope, just 200 years apart. The advantage here is that you don’t have to pre-bunk every conspiracy theory anew; once people know the underlying trick, they’re less likely to be fooled.
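For a sense of the arithmetic behind herd immunity, here is a minimal sketch using the textbook epidemiological threshold, 1 − 1/R0. Carrying that formula over to misinformation is an illustrative assumption, not a figure from the experiments described here.

```python
# Minimal sketch: the textbook herd-immunity threshold, 1 - 1/R0,
# carried over (as an illustrative assumption) to misinformation.
# If each sharer passes a falsehood to R0 new people on average, then
# inoculating at least this fraction of the population means each
# falsehood reaches fewer than one susceptible person, and it dies out.
def herd_immunity_threshold(r0: float) -> float:
    return max(0.0, 1.0 - 1.0 / r0)

for r0 in (1.5, 2.0, 3.0):
    print(f"R0 = {r0}: inoculate at least "
          f"{herd_immunity_threshold(r0):.0%} of the population")
```

The more contagious a falsehood, the larger the share of the population that needs to be pre-bunked before its spread fizzles out.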

The second innovation is that we need a “virtual needle” to scale the approach. In one massive field experiment during the pandemic, we teamed up with the UK government, the WHO, and the United Nations to pre-bunk harmful misinformation about COVID-19. Across all social media channels, the campaign garnered over 200 million views. In another experiment we conducted together with Google, the first of its kind, we were able to run pre-bunking videos live on YouTube in the ad space, reaching millions of users. In theory, YouTube could use that ad space not for profit but to help pre-bunk potentially harmful misinformation by inserting and scaling our videos worldwide. Twitter conducted a massive pre-bunking campaign during the 2020 U.S. presidential election, warning millions of U.S. users in advance that they might come across misinformation about voting by mail on the platform, while reassuring them that election experts agree that voting by mail is safe. Results indicated that the pre-bunking messages reinforced trust in the election process and led people to seek out more reliable information.

“The greatest potential for inoculation to spread is from person to person: spreading to our own family, friends, and neighbors.”

Although these experiments show the approach is scalable on social media, inoculation shouldn’t only come from the top down. The approach is meant to empower anyone to pre-bunk falsehoods. The greatest potential for inoculation to spread is from person to person: spreading to our own family, friends, and neighbors. We can all pre-bunk conspiracy theories and misinformation with the power of a simple conversation. That’s how we’ll build true herd immunity against misinformation.

5. Make it fun and don’t tell people what to believe.

One of the things the Cambridge Social Decision-Making Lab is known for is its sense of humor. One of the first interventions we created was Bad News, a completely over-the-top social media simulation, designed with a gaming company, that makes the idea behind pre-bunking fun and entertaining. It’s not a boring media literacy lecture; it’s radical, and meant to stimulate people’s defenses. Bad News simulates Twitter and allows people to temporarily step into the shoes of an online manipulator.

In the game, players are forewarned about the dangers of misinformation but also actively encouraged to make use of weakened doses of common misinformation techniques as part of their journey to becoming a fake news tycoon. We call this “active inoculation,” and players are invited to generate their own antibodies. We spent years studying how professionals dupe people, from impersonating experts and concocting conspiracy theories to creating polarizing headlines and trolling people with a bot army. The idea is to build psychological immunity by actively helping players identify how they might be attacked with misinformation in the future. It’s a bit like the first time you see a new magic trick. You’re duped and want to know how it works, right? To find the answer, you could either look at a complicated blueprint and engage analytically (akin to a fact-check) or step into the shoes of a magician. The latter approach normally helps ensure people are not fooled by the same trick twice.

From the very beginning, the approach was never about what’s true or false. The amount of ridiculous fake news out there pales in comparison to the amount of misleading or biased news. The problem we’re facing is not going to be solved by telling people what’s fact or fiction.

What happens in Bad News is that people are asked to judge the credibility of headlines based on the presence or absence of common manipulation techniques; nobody is told what to believe in our interventions. The games are non-political insofar as people can make fun of the government or big corporations; they can troll people on the right or the left. In fact, the narrator tells players that it doesn’t matter: the goal of misinformation is often just to divide people. Since Aristotle’s time, the same techniques have repeatedly been used to mislead people, whether it’s scapegoating, false dilemmas, conspiracy theories, or fake experts. We can all agree that helping people discern these tactics is a good thing insofar as it empowers us to make up our own minds about what’s reliable and what’s not.

To listen to the audio version read by author Sander van der Linden, download the Next Big Idea App today.
