
Why We Believe—Long After We Shouldn’t


It’s pretty clear nowadays that we are not the rational animals we’d like to believe we are; in fact, we are more accurately called the “rationalizing animal.” Skeptics are often puzzled when we calmly provide evidence that a popular belief is wrong, that some group is holding onto a way of doing things that’s long past its sell-by date, and recipients of this valuable information don’t say, “Why, thank you! I had no idea!” Why would people prefer to justify mistaken beliefs, behavior, and practices rather than change them for better ones? Isn’t it good to know you didn’t cause your child’s autism with vaccinations?

As skeptics we are faced constantly with what psychologists call “the motivated rejection of science.” Take global warming, for example. It’s easy to assume that climate-change deniers are less educated or informed than wise scientists, but it’s not so simple. An article in Psychological Science by Stephan Lewandowsky and Klaus Oberauer found that attitudes about global warming are unrelated to levels of scientific literacy, numeracy, or education. They are associated with political partisanship; that is, among liberals, higher levels of scientific literacy and education are associated with increased acceptance of climate change, the importance of vaccination, and trust in science. But among conservatives, higher levels of scientific literacy and education are associated with reduced acceptance. That’s motivated cognition; people are emotionally motivated to reject findings that threaten their core beliefs or worldview. At present, the researchers found, public rejection of scientific findings is more prevalent on the political right than the left, yet, they added, “the cognitive mechanisms driving rejection of science are found regardless of political orientation.” Meaning: It depends what scientific finding it is. Whether your worldview comes from the left or right, you will be tempted to sacrifice skepticism even when your side is promoting some cockamamie belief without evidence.

Decades ago, the great social psychologist Gordon Allport, in his brilliant book The Nature of Prejudice, offered this exchange to illustrate the weaselly way a person with a prejudice or other entrenched belief argues with you.

Mr. X: The trouble with Jews is that they only take care of their own group.

Mr. Y: But the record of the Community Chest campaign shows that they give more generously, in proportion to their numbers, to the general charities of the community, than do non-Jews.

Mr. X: That shows they are always trying to buy favor and intrude into Christian affairs. They think of nothing but money; that is why there are so many Jewish bankers.

Mr. Y: But a recent study shows that the percentage of Jews in the banking business is negligible, far smaller than the percentage of non-Jews.

Mr. X: That’s just it; they don’t go in for respectable business; they are only in the movie business or run night clubs.

Notice that people like Mr. X—which is all of us on occasion—don’t actually argue or respond to the point; they slide off your evidence and raise an irrelevant digression rather than face, let alone change, their fundamental belief. “I believe it” becomes enough.

The key motivational mechanism that underlies the reluctance to change our minds, to admit mistakes, and to accept unwelcome scientific findings is cognitive dissonance—the discomfort we feel when two cognitions, or cognition and behavior, contradict each other. Leon Festinger, who developed this theory sixty years ago, showed that the key thing about dissonance is that, like extreme hunger, it is uncomfortable, and, like hunger, we are motivated to reduce it. For smokers, the dissonant cognitions are “Smoking is bad for me” versus “I’m a heavy smoker.” To reduce that dissonance, smokers either have to quit or justify smoking. Before we make a decision (about a car, a candidate, or anything else), we are as open-minded as we are likely to be; but after we make a decision, we have to reduce dissonance. To do this, we will emphasize everything good about the car we bought or the candidate we are supporting or the belief we accepted and notice only the flaws in the alternatives.

Dissonance theory comprises three cognitive biases in particular:

  1. The bias that we, personally, don’t have any biases—the belief that we perceive objects and events clearly, as they really are. Any opinion I hold must be reasonable; if it weren’t, I wouldn’t hold it. If my opponents—or kids or friends or partner—don’t agree with me, it is because they are biased.
  2. The bias that we are better, kinder, smarter, more moral, and nicer than average. This bias is useful for plumping up our self-esteem, but it also blocks us from accepting information that we have been not-so-kind, not-so-smart, not-so-ethical, and not-so-nice.
  3. The confirmation bias, the fact that we notice and remember information that confirms what we believe and ignore, forget, or minimize information that disconfirms it. We might even call it the consonance bias, because it keeps our beliefs in harmony by eliminating dissonant information before we are even aware of it.

Dissonance is painful enough when you realize that you bought a lemon of a car and paid too much for it. But it’s most painful when an important element of the self-concept is threatened; your post-car-purchase dissonance will be greater if you see yourself as a car expert and superb negotiator. We have two ways to reduce dissonance: either accept the evidence and change the self-concept (“Yes, that was a foolish/incompetent/unethical thing to do; was I ever wrong to believe that”) or deny the evidence and preserve the self-concept (“That study was fatally flawed”). Guess which is the popular choice?

Understanding cognitive dissonance helps explain the astonishing obstinacy that some people reveal when they are shown to be wrong. Consider the conspiracy theorists who vehemently deny the horrifying evidence that Adam Lanza killed twenty children at Sandy Hook Elementary School. They maintain it was all a conspiracy of the gun-control lobby, and they persist in that delusion even when faced by grieving parents holding photos of their beloved children. But dissonance theory explains why people can hold crazy ideas without necessarily being crazy. If we start from where the disbelievers are, holding core beliefs in the importance of owning guns, that guns are safe, and that gun-control people want to take their guns away, then information that guns were used for a rampage that left twenty little children (and six school staff) dead is powerfully dissonant. By denying the evidence that this tragedy occurred, they get to retain their gun beliefs and their self-esteem: why, they were smart and right all along to oppose gun control of any kind. Indeed, dissonance theory would predict that their opposition would become even stronger—look at the effort those bastards put into creating the fiction of Sandy Hook. They must really want to take our guns away.

The greatest danger of dissonance reduction occurs not when a belief or action is a one-time thing like buying a car, but when it sets a person on a course of action. The metaphor that we use in our book is that of a pyramid. Imagine that two students are at the top of a pyramid, a millimeter apart in their attitudes toward cheating: it is not a good thing to do, but there are worse crimes in the world. Now they are both taking an important exam, when they draw a blank on a crucial question. Failure looms, at which point each one gets an easy opportunity to cheat by reading another student’s answers. After a long moment of indecision, one spontaneously yields and the other resists. Each gains something important, but at a cost: one gives up integrity for a good grade; the other gives up a good grade to preserve his integrity.

As soon as they make a decision—to cheat or not—they will justify the action they took in order to reduce dissonance, that is, to keep their behavior consonant with their attitudes. They can’t change the behavior, so they shift their attitude. The one who cheated will justify that action by deciding that cheating is not such a big deal: “Hey, everyone cheats. It’s no big deal. And I needed to do this for my future career.” But the one who resisted the temptation will justify that action by deciding that cheating is far more immoral than he originally thought: “In fact, cheating is disgraceful. People who cheat should be expelled.” By the time they finish justifying their actions, they have slid to the bottom of the pyramid and now stand at opposite corners of its base, far apart from one another. The one who didn’t cheat considers the other to be totally immoral, and the one who cheated thinks the other is hopelessly puritanical—and, come to think of it, why don’t I just buy the services of a professional cheater to take the whole course for me? I really need the credits, and so what if I never learn what this class requires? I’ll learn on the job. Hey, neurosurgery can’t be that hard.

As we go through life we will find ourselves at the top of many such metaphorical pyramids, whenever we are called upon to make important decisions and moral choices: for example, whether to accept growing evidence that a decision we made is likely wrong; whether to believe a sensational rape or murder case in the media; whether to blow the whistle on company corruption or keep quiet and not rock the boat. As soon as we make a decision, we stop noticing or looking for disconfirming evidence, and we are on that path to the bottom, where certainty lies.

This process blurs the distinction that people like to draw between “us good guys” and “those bad guys,” or, occasionally in the skeptic world, “us smart, reasonable guys and those ignorant, crazy guys.” Often, when standing at the top of the pyramid we are faced not with a clear go-or-no-go decision but instead with ambiguous choices whose consequences are unknown or unknowable. We make an impulsive decision, and then we justify it to reduce the ambiguity of the choice. And soon we are trapped in a process of action, justification, and further action that increases our commitment to that first tentative decision. Taking the next step down the pyramid in that direction is almost inevitable, because otherwise we have to go back up and say, “I was wrong to take that first little step.” How do you corrupt an innocent person? How does a company or a country get enmeshed in illegal or unethical decisions? They only have to take a small step off the pyramid, and self-justification will do the rest.

Dissonance reduction has benefits, including letting us sleep at night—and besides, it’s good to hold an informed opinion and not change it with every fad or every new study that comes along. But it is also essential to be able to let go of that opinion when the weight of the evidence dictates, even if we are far down that pyramid. Dissonance reduction may be built into our mental wiring, but how we think about our mistaken actions and beliefs is not.

Living with dissonance requires us to learn how to admit our mistakes and separate them from our self-esteem. Our brains may be wired for self-justification, but that is no justification for failing to override the impulse—and we can. That’s what the skeptical movement is designed to help us do: show that people can remain committed to their country, political party, friends, and family, yet understand that it is not disloyal to disagree with actions or policies or candidates we find wrong or reprehensible. And when we are faced with evidence of our own mistaken beliefs, we can learn to say: “When I, a kind and smart person, make a mistake, I remain a kind and smart person; the mistake remains a mistake. Now, how do I remedy what I did and make sure I don’t repeat it?”

Skeptics already have an immense challenge in debunking pseudoscience, con artists, and conspiracy theories; to this burden we’d add another: facing our own sources of dissonance—ambiguity, complexity, and compromise. For some on the left, “compromise” means selling out; for some on the right, “compromise” means consorting with the enemy. But no politician will do everything we want; no feminist or civil rights activist can achieve 100 percent ideological purity; no human being can be 100 percent free of bias. That may be the most dissonant message of all.

