By Ozan Varol
If you had asked me this question – How do you change a mind? – two years ago, I would have given you a different answer.
As a former scientist, I would have cautioned you to rely on objective facts and statistics. Develop a strong case for your side, back it up with cold, hard, irrefutable data, and voila!
Drowning the other person in facts, I assumed, was the best way to prove that global warming is real, the war on drugs has failed, or the current business strategy adopted by your risk-averse boss with zero imagination is not working.
Since then, I’ve discovered a significant problem with this approach.
It doesn’t work.
The mind doesn’t follow the facts. Facts, as John Adams put it, are stubborn things, but our minds are even more stubborn. Even for the most enlightened among us, facts don’t always resolve doubt, however credible and convincing they might be.
Because of the well-documented confirmation bias, we tend to undervalue evidence that contradicts our beliefs and overvalue evidence that confirms them. We filter out inconvenient truths and arguments on the opposing side. As a result, our opinions solidify, and it becomes increasingly harder to disrupt established patterns of thinking.
We believe in alternative facts if they support our pre-existing beliefs. Aggressively mediocre corporate executives remain in office because we interpret the evidence to confirm the accuracy of our initial hiring decision. Doctors continue to preach the ills of dietary fat despite emerging research to the contrary.
If you have any doubts about the power of confirmation bias, think back to the last time you Googled a question. Did you meticulously read each link to get a broad, objective picture? Or did you simply skim through the links looking for the page that confirms what you already believed was true? And let’s face it, you’ll always find that page, especially if you’re willing to click through to page 12 of the Google search results.
If facts don’t work, how do you change a mind – whether it’s your own or your neighbor’s?
We’re reluctant to acknowledge mistakes. To avoid admitting we were wrong, we’ll twist ourselves into positions that even seasoned yogis can’t hold.
The key is to trick the mind by giving it an excuse. Convince your own mind (or your friend’s) that the prior decision or belief was the right one given what was known at the time, but now that the underlying facts have changed, the belief should change too.
But instead of giving the mind an out, we often go for a punch to the gut. We belittle the other person (“I told you so”). We ostracize (“Basket of deplorables”). We ridicule (“What an idiot”).
Schadenfreude might be your favorite pastime, but it has the counterproductive effect of activating the other person’s defenses and solidifying their positions. The moment you belittle the mind for believing in something, you’ve lost the battle. At that point, the mind will dig in rather than give in. Once you’ve equated someone’s beliefs with idiocy, changing that person’s mind will require nothing short of an admission that they are unintelligent. And that’s an admission that most minds aren’t willing to make.
Democrats in the United States are already falling into this trap. They’re not going to win the 2020 presidential election by convincing Donald Trump supporters that they were wrong to vote for him last November or that they’re responsible for his failures in office. Instead, as author and psychology professor Robert Cialdini explains, Democrats must offer Trump supporters a way to get out of their prior commitment while saving face: “Well, of course you were in a position to make that decision in November because no one knew about X.”
Colombians adopted a similar strategy in the 1950s when the Rojas dictatorship collapsed. As I explain in my forthcoming book, although the Colombian military was complicit in the abuses of the Rojas regime, civilians deftly avoided pointing any fingers at the military. Instead, they managed to march the military back to the barracks with its dignity intact. They recognized that they would need the military’s cooperation both during the transition process and in its aftermath. So they offered an alternative narrative for public consumption that uncoupled the armed forces from the Rojas regime. In this narrative, which the military leaders found much easier to swallow, it was the “presidential family” and a few corrupt civilians close to Rojas, not military officers, who were responsible for the regime’s excesses. Had the civilians taken a different approach, the result might have been a military dictatorship rather than a democracy.
In my early years in academia, I tended to get defensive when someone challenged one of my arguments during a presentation. My heart rate would skyrocket, I would tense up, and my answer would reflect the disdain with which I viewed the antagonistic question (and the questioner).
I know I’m not alone here. We all tend to identify with our beliefs and arguments.
This is my business.
This is my article.
This is my idea.
But here’s the problem. When your beliefs are entwined with your identity, changing your mind means changing your identity. That’s a really hard sell.
A possible solution, and one that I’ve adopted in my own life, is to put a healthy separation between you and the products of you. I changed my vocabulary to reflect this mental shift. At conferences, instead of saying, “In this paper, I argue …,” I began to say “This paper argues … ”
This subtle verbal tweak tricked my mind into thinking that my arguments and I were not one and the same. Obviously, I was the one who came up with these arguments, but once they were out of my body, they took on a life of their own. They became separate, abstract objects that I could view with some objectivity.
It was no longer personal. It was simply a hypothesis proven wrong.
Playing Al Gore’s An Inconvenient Truth on repeat to a room of Detroit auto workers won’t change their minds on global warming if they’re convinced your agenda will put them out of a job.
Humans operate on different frequencies. If someone disagrees with you, it’s not because they’re wrong and you’re right. It’s because they believe something that you don’t believe.
The challenge is to figure out what that thing is and adjust your frequency. If employment is the primary concern of the Detroit auto worker, showing him images of endangered penguins (as adorable as they may be) or Antarctica’s melting glaciers will get you nowhere. Instead, show him how renewable energy will provide job security to his grandchildren. Now, you’ve got his attention.
We live in a perpetual echo chamber. We friend people like us on Facebook. We follow people like us on Twitter. We read the news outlets that are on the same political frequency as us.
This means our opinions aren’t being stress-tested nearly as often as they should be.
Make a point to befriend people who disagree with you. Expose yourself to environments where your opinions can be challenged, as uncomfortable and awkward as that might be.
Marc Andreessen has a saying that I love: “Strong beliefs, loosely held.” Strongly believe in an idea, but be willing to change your opinion if the facts show otherwise.
Ask yourself, “What fact would change one of my strongly held opinions?” If the answer is “no fact would change my opinion,” you’re in trouble. A person who is unwilling to change his or her mind even with an underlying change in the facts is, by definition, a fundamentalist.
In the end, it takes courage and determination to see the truth instead of the convenient.
But it’s well worth the effort.
Ozan Varol is a rocket scientist turned law professor and bestselling author of The Contrarian Handbook: 8 Principles for Innovating Your Thinking.
Originally published at www.theladders.com