Today I’m going to talk about why people believe stupid things. Like really stupid things. Things that are so clearly wrong that these people must have something wrong with their brains.
By that I mean that this is another article on Cognitive Biases. And by ‘these people’ I mean everybody.
Since the US is currently in the midst of its presidential primary election season, beliefs and the art of persuasion have been on my mind. The sheer distance between what each side of the aisle believes is mind-boggling, which, if you think about it, is a bit odd.
We’re all exposed to the same information, the same news articles, the same facts. And yet, we come to vastly different conclusions.
Today I’m going to try to explain why.
The first bias I'll cover in this article is confirmation bias, likely the most well-known of the bunch. The other effects I will discuss are all types of confirmation bias.
Confirmation bias is the tendency towards preferring what we already believe. While we’d like to think that we get exposure to all different sides of an issue, we really don’t.
Because of your confirmation bias, you will tend to seek out information that confirms your beliefs. Few people watch both Fox and MSNBC, as the common view is that they are biased to the right and left respectively. (You can’t escape this even if you try, as Google personalizes their search results based on your personal interests.)
When you see new information, you will interpret it in the way most favorable to your currently held beliefs. You'll make excuses for the things that contradict your view and take supporting information as gospel.
You will actually remember the information that confirms your previously held beliefs better than contradictory information. That sounds nuts, but if you've read my article on False Memory it shouldn't sound like much of a stretch. Because you remember supporting arguments better than opposing ones, from your perspective your views will always be the best supported.
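The asymmetry described above can be sketched as a toy simulation: if confirming evidence shifts your belief much more than contradicting evidence does, an even mix of evidence still drags you toward certainty. The update rule and weights here are my own illustrative assumptions, not taken from any study.

```python
import random

def update_belief(belief, evidence_supports,
                  weight_confirm=0.10, weight_contradict=0.02):
    """Nudge a belief (a number between 0 and 1) up or down.

    Toy model of confirmation bias: confirming evidence moves the
    belief five times as much as contradicting evidence does.
    """
    if evidence_supports:
        belief += weight_confirm * (1.0 - belief)   # weighted heavily
    else:
        belief -= weight_contradict * belief        # mostly excused away
    return belief

# Start mildly convinced, then observe a 50/50 mix of evidence.
random.seed(0)
belief = 0.6
for _ in range(100):
    belief = update_belief(belief, random.random() < 0.5)
print(f"final belief: {belief:.2f}")
```

Despite the evidence being perfectly balanced, the belief climbs toward its equilibrium well above where it started, because each contradiction is discounted while each confirmation is banked.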
This causes us to become very insular in the information we pay attention to, and the people we associate with. You stop reading and watching things that don’t confirm your viewpoints. You stop exposing yourself to other information because it’s uncomfortable. And it’s difficult to be friends with someone when they hold a belief that you honestly believe is delusional.
You may like to claim that you’re open-minded, but you’re probably not. Without breaking through this bias you are destined to hold the same beliefs forever.
One of the effects of the confirmation bias is…surprising. And surprisingly prevalent, so I need to call it out and explain it.
When presented with evidence against their beliefs, some people will change their views in such a way as to strengthen their already held beliefs. This is beyond just ignoring the new information: presenting a liberal with facts supporting the conservative argument will push them to become more liberal, not more moderate.
I can't overstate this. Someone who believes that the Earth is flat will, when shown a picture of the spherical Earth as seen from space, react by throwing out the evidence and believing even more strongly that the Earth is definitely flat.
It defies logic, but you usually can't change someone's beliefs with facts alone. I can sit around all day and explain to you why the Earth is clearly planet-shaped, but if you were firmly locked into your belief that it was flat when we started, you will only be more entrenched in that belief when we finish.
As an example a little closer to home, here is the study that first described the backfire effect (as far as I can tell, at least) [PDF]. It describes how conservatives were presented with a fake article containing a quote from George W. Bush claiming the existence of weapons of mass destruction in Iraq. Some of the subjects were then shown a factual correction pointing out that the quote was false and no WMDs existed. The people who read the factual correction were measured afterwards as holding stronger beliefs that the initial quote was true, despite the correction pointing out the mistake.
Their beliefs were strengthened by a factually correct argument against them.
Recognizing Tactics of Persuasion
Watching politics allows us to see persuasion techniques in action. You can observe politicians that agree with your beliefs and those that disagree, and watch how the best of them work to change your belief to their side.
Importantly, you should observe how you respond to the arguments as well. Do you tend to give more weight to the arguments supporting your views?
Probably. So politicians try to use other means to persuade you to their side. When you start to look at them objectively, they stratify into a few different categories.
Some politicians are terrible, and their main tactic is the 'gotcha'. Obviously that won't work; it's pure divisiveness. It will secure your base but not win over anyone. It's like trying to convince an Englishman that the royal family isn't important by insulting the queen. You just don't do it.
A better politician will try to convince you that their position is defensible by using facts. Most would think that this should work, but as we’ve seen above it doesn’t. People ignore facts in favor of previously held beliefs.
A great politician already knows those don’t work. So they use other means.
A great manipulator* focuses on convincing you with rhetoric instead of facts. Often they don't even try to convince you of specific issues; they just work to get you to believe that they themselves are trustworthy. Once they've given you a new belief (that they are trustworthy and capable), they tell you what the correct views are. If your belief that they are trustworthy and capable is held more strongly than your belief in whatever it is they're trying to convince you of, they've won.
*Yes, manipulator has negative connotations, but I don’t intend it that way.
They know that people hear what they want to hear, so they employ strategic ambiguity to make statements on divisive topics that everyone would agree with. Here are some examples describing how Donald Trump is using that strategy.
They repeat themselves over and over because the more times you hear it the more likely you are to believe it’s true. That’s the Illusory Truth Effect, another cognitive bias.
They use a limited vocabulary to talk to the public, because using simpler language in an argument sounds more persuasive.
If you know what to look for, watching politics is like watching a competitive sport. It’s a competition to see who is the best manipulator.
I am of the belief that this is a problem.
I don't _necessarily_ have a problem with others succumbing to these tendencies, but I don't want to be subject to them myself. I don't want to be convinced by rhetoric and manipulation; I want my viewpoints to be closer to objective reality. Sound science and defensible logical inference should be the base upon which I build my views and beliefs.
Hopefully you feel the same.
Combatting it is difficult, though. You need to pay attention to when you have no real facts backing up your argument. You need to recognize when you are experiencing cognitive dissonance: believing two contradictory things at once.
Those are probably signs you need to take a step back and reevaluate.
Cracking Your Naive Realism
The whole point of this article is to try to break through your perception of Naive Realism. In fact, that’s really the point of my whole Cognitive Bias series.
Naive Realism is the belief that we see the reality around us objectively, and that people who disagree must be uninformed, irrational, or intentionally biased.
Hopefully, by understanding how you (and others) can believe in something delusional while still being a rational person, you can start to challenge your long-held belief that you are always correct.
So here’s a question for you. And I want you to really think about it.
What are you convinced of that may be wrong?
I can guarantee you have a viewpoint that is blatantly opposed to the viewpoint of some large group of people, and those people can’t all be delusional.
Perhaps you believe that climate change is not man-made, despite the scientific consensus. Perhaps you think that the Earth is ‘planet-shaped’ despite all evidence to the contrary. Perhaps you think vaccines are safe, despite this website describing how vaccines cause autism (with sources!).
Maybe you think that democracy is the best form of government, or that the U.S. is one.
The thing is, what if you’re wrong?
If you’d like to read more about divisiveness in politics and possible solutions to the problem, check out The Centrist Manifesto.
Image courtesy of Ryan McGuire.