Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts

Have you ever done something stupid or just plain wrong, and then tried to blame someone else? Sure you have. It’s natural to try to shirk responsibility for your slip-ups, but that doesn’t mean it’s a good idea. In 2007, social psychologists Carol Tavris and Elliot Aronson published Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, a popular manual on how to identify and correct this behavior in yourself and others. Now in its third edition, it’s one of our curator Adam Grant’s favorite books.


Read on for five key insights from Mistakes Were Made (but Not by Me). To listen to the audio version of this Book Bite, download the Next Big Idea App today.

1. The means aren’t only justified at the end.

In the 1970s, a White House aide named Jeb Magruder witnessed his boss do something that gave him pause. His boss—Richard Nixon’s chief of staff Bob Haldeman—chewed out an assistant for a minor infraction, even threatening to fire the man. Magruder didn’t like how Haldeman was behaving, but he told himself it was justified. After all, Haldeman worked for the president; men in high positions shouldn’t be burdened with small annoyances.

According to Tavris and Aronson, that small decision to look past his boss’s aggressive behavior set Magruder on a path that would eventually land him in prison over the Watergate scandal. If he hadn’t gone along with small transgressions when he first took the job, he might not have been tempted into committing larger criminal acts alongside his peers.

“Having wrestled with a decision and justified it, we unconsciously commit ourselves to certain behaviors in the future, good or bad.”

When you justify your small decisions, good or bad, you start down a road that can be hard to get off. For example, a law student may convince himself that it’s OK to cheat on the bar exam by telling himself, “I know the law, I’m just not a good test-taker.” Having justified cheating once makes it easier for him to take similar shortcuts later in his career.

In the same way, a student who has the opportunity to cheat—and chooses not to—can lock in a distaste for dishonesty. Having wrestled with a decision and justified it, we unconsciously commit ourselves to certain behaviors in the future, good or bad.

2. To fight bias, acknowledge it.

“Objects are closer than they appear” is a warning you’ve probably seen on a car’s side mirror, and it’s a good reminder that perception is not the same as reality. In fact, the same caution should be applied everywhere. Things are not as they seem to us, because we all have biases and blind spots. Pretending that we don’t can get us in trouble.

Psychologists talk about a phenomenon called naive realism. It’s our tendency to think we see the world with perfect clarity—that our ideas and beliefs are inherently reasonable, and that open-minded and fair people will agree with us.

“Surround yourself with people who challenge your ideas and hold you accountable for inaccurate or harmful notions.”

But naive realism only makes it easier for groups to become entrenched in their mindset. If a group believes they only hold reasonable ideas, then the only people who would disagree are those who are unreasonable. Rather than making us more open-minded, naive realism encourages us to double down on our biases, and strengthens our tendency to think in adversarial terms about those who are different, especially during times of economic or social distress. For example, in some Western countries, fear of COVID-19 increased the animosity expressed toward Chinese people and others of Asian descent. Not every prejudiced person commits overt discrimination, of course, but the more others in the community agree with that bias, the more emboldened a prejudiced person will become in his viewpoint.

So how can we combat our naive realism and root out our biases? One way is to cultivate a wide social circle with diverse points of view. Surround yourself with people who challenge your ideas and hold you accountable for inaccurate or harmful notions.

3. Memory is a reconstruction, not a recollection.

Movies, television shows, and other forms of media tend to present memories as exact replications of previous events. Like a video camera, the mind is depicted as a machine that exactly records everything it witnesses. But the human mind can’t possibly record memory in this way. Our brains are constantly bombarded by information, most of which we instantly dismiss, forget, or edit. The moments we’re quickest to forget or alter are those which conflict with our view of ourselves. That’s why, when telling an anecdote, people tend to highlight the parts of the story that make them look good and leave out the other bits.

“The moments we’re quickest to forget or alter are those which conflict with our view of ourselves.”

According to Tavris and Aronson, inaccurate recollections aren’t lies so much as they are products of self-justification and confabulation. Tavris, for example, still recalls her father reading to her from her favorite children’s book. That memory, though, is undoubtedly a fabrication, since she discovered years later that the book she remembers her father reading was published after his death. At some point, Tavris’s love for her father got mingled with the memory of someone else reading her that book, leaving her with an inaccurate, albeit emotional, recollection.

Memories that are important to a person’s sense of self are even more subject to revision and false recollection. When test subjects are asked to recall times when they did something antithetical to their current values, they tend to distort the memory, retelling it from a third-person perspective and removing themselves as a central actor in the scene. Ultimately, this attempt to dissociate from one’s own actions allows the person to avoid the hard work of integrating who they once were with who they are now.

4. Respect comes to those who fess up.

Self-justification isn’t always bad. As Tavris and Aronson point out, it can be a healthy way to reduce the mental anguish that comes with learning you made a mistake. But it takes a lot of energy to protect your ego in this way, and sometimes it’s less painful to simply own up to the mistake.

Taking responsibility for missteps isn’t easy, but when done correctly, it can be an act of healing, and can even earn you respect and credibility. Leaders who are willing to accept blame and correct their mistakes are generally viewed as more trustworthy and capable of command.

Of course, when you admit responsibility and apologize for your actions, there are no guarantees it will be taken the way you want. People who don’t like you may see your apology as a sign of weakness, or they may consider it false or forced. But a sincere apology that avoids displacing blame or making excuses has the potential to sway those who have been wronged, and give you peace of mind.

“Leaders who are willing to accept blame and correct their mistakes are generally viewed as more trustworthy and capable of command.”

5. Dissonance is normal, but not universal.

Everyone makes mistakes, but not everyone admits it. Most of us would rather blame someone else or explain away the mistake, especially when it conflicts with our self-image. The discomfort behind that impulse is cognitive dissonance: the tension that arises when reality clashes with your sense of yourself and the world. “I’m a good and competent person,” you tell yourself. “So that couldn’t really have been my fault.”

Cognitive dissonance may be the usual consequence of committing socially questionable acts, but that doesn’t mean that everyone experiences dissonance the same way. Political demagogues, for example, may feel very little cognitive dissonance because they think they’re always in the right.

And there are some people who feel too much cognitive dissonance. For example, soldiers who have had to commit acts of violence in combat can develop depression or post-traumatic stress disorder because their wartime actions conflict with their self-concept. This extreme dissonance is experienced as a trauma that usually requires therapy and self-forgiveness to heal.

People who suffer from low self-esteem are also likely to experience cognitive dissonance differently than their more confident peers. You see this in people who can’t accept a compliment, receive praise, or admit their talents. Because they see themselves as failures, they’re quick to dismiss any evidence of their achievements. Confronting this false image may be momentarily painful, but the relief it will yield is well worth the effort.

To listen to the audio version of this Book Bite and many more, download the Next Big Idea App today.
