[bctt tweet="'A lot of good arguments are spoiled by some fool who knows what he is talking about.' —Miguel de Unamuno" username="inipatrick"]
A man is convinced he’s dead. His wife tells him he’s not dead. His kids tell him he’s not dead.
His friends tell him he’s not dead.
But he still insists he is.
Finally, at their wits’ end, the family talks him into seeing a psychiatrist.
The psychiatrist decides that his best strategy for curing the man is to convince him to accept one fact: Dead men do not bleed.
Over the next six months, he has the man study medical textbooks and anatomy charts, all of which support the psychiatrist’s point.
Then, he has the man observe autopsies, dissect cadavers, and work as an assistant in a funeral home.
Finally, the exasperated man says to the psychiatrist, “Okay. Enough already. I get it: Dead men do not bleed.”
The psychiatrist smiles, and then takes the man’s hand and pricks the end of his finger with a needle.
As a drop of blood oozes out, the man looks at his finger and says, “Well, what do you know! Dead men do bleed!”
[bctt tweet="Logic doesn't always win out." username="inipatrick"]
Studies have shown that we'll often ignore facts, or twist them, if they don't support our views.
So, if you're trying to win an argument and expect to get the other person to change their mind, you're probably wasting your time.
You may be right, but being right has nothing to do with it: people believe what they want to believe.
BEYOND THE PUNCH LINE
It might seem crazy that a person would cling to a belief in the face of incontrovertible evidence, but it's so common that researchers have coined a term for it: "the backfire effect."
The reason: Attempting to correct people’s misconceptions will often backfire and bind them even more strongly to their erroneous beliefs.
- This has been demonstrated many times. In a 2006 study, for example, subjects were given two newspaper articles: The first one supported their beliefs with incorrect data; the second one corrected the misinformation. Rather than changing their views in any way, subjects decided the second article was incorrect . . . and even tended to see a conspiracy behind the correction.
- It’s not merely anecdotal: Brain scans have confirmed these results. When subjects in an MRI test were shown information that confirmed what they already thought about a particular topic, areas of the brain associated with learning lit up. But when the same subjects were shown info that contradicted their beliefs, sections of the brain that are associated with “thought suppression” lit up. This illustrates how difficult it is to change anyone’s mind once it’s made up. It’s not just about logic; it’s biology.
So, is it possible to change people's minds at all? Maybe. Researchers say:
- People will listen to (though not necessarily accept) new information if an issue affects their lives directly. If they’ve got no skin in the game, they’ll tune it out and stick with prior beliefs.
- People are more likely to weigh new evidence seriously (and even change their views) when they’re deliberating with a small group. In one study, people working in small groups changed their minds 75 percent more frequently than when deliberating alone.
- People who are asked to explain how their ideas work in the real world may realize how little they know about a subject and modify their beliefs. On the other hand, people who are asked to explain why they believe something are likely to become more adamant about their beliefs.