Tag Archives: confirmation bias

Just the Facts…

I guess we no longer need the “big lie.” We Americans–for that matter, people everywhere–are perfectly comfortable simply rejecting facts that make us uncomfortable, or otherwise conflict with our preferred realities.

I’ve previously blogged about the emerging academic literature on confirmation bias.  A reader sent me an article from the Boston Globe summarizing much of that literature.

Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.


Needless to say, this is a real problem for democratic theory, which places a high value on an informed populace.


This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

As the author notes, we humans tend to base our opinions on our beliefs–and those beliefs can have what he delicately calls “an uneasy relationship” with facts. Although we like to believe that we base our beliefs on evidence and fact, research suggests that our beliefs all too often dictate the facts we’re willing to accept.


Sometimes we just twist facts to make them fit with our preferred beliefs; at other times our preconceptions lead us to uncritically accept rumor, misinformation and outright propaganda if those reinforce our worldviews or confirm our resentments and/or suspicions.


The phenomenon is certainly not limited to the political right, but the most recent glaring examples do come from the GOP “clown car.” Donald Trump insists that he saw “thousands of Muslims” cheering when the World Trade Center came down, even though everyone in a position to know says that never happened. Ben Carson “quotes” America’s founders making statements they never made (in some cases, statements expressing sentiments diametrically opposed to what they actually said). Carly Fiorina insists that she viewed a video that doesn’t exist. And people who want to believe them, do.


As the Globe article put it, thanks to the internet, “it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.”


Identifying the problem and solving it are two different matters. To date, there has been real progress on identifying the phenomenon, far less on what we need to do to counter it. That said, researchers are working on it.


One avenue may involve self-esteem. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.


No wonder those of us advocating for evidence-based public policies are having such a bad time…

Ignorance is One Thing, Anti-Knowledge Another

I’ve run across several columns/posts recently focused on a distinction–one that is gaining in importance–between ignorance and anti-knowledge, or what we might call intentional or stubborn ignorance. In the aftermath of yet another presidential debate, the distinction merits consideration.

As Lee McIntyre put it in last Sunday’s New York Times,

We’ve all heard the phrase “you’re entitled to your own opinion, but not your own facts.” Opinions are the sorts of things about which we can take a poll. They are sometimes well-informed, but rarely expected to be anything other than subjective. Facts, on the other hand, are “out there” in the world, separate from us, so it makes little sense to ask people what they think of them. As the comedian John Oliver so aptly put it… “You don’t need people’s opinion on a fact. You might as well have a poll asking: ‘Which number is bigger, 15 or 5?’ Or ‘Do owls exist’ or ‘Are there hats?’”

McIntyre distinguishes skepticism–withholding belief because the evidence does not live up to the standards of science–from denialism, the refusal to believe something even in the face of what most reasonable people would take to be compelling evidence.

At Dispatches from the Culture Wars, Ed Brayton has a similar rumination on the phenomenon he calls “virulent ignorance,” and quotes from an article by former congressional staffer Mike Lofgren:

Fifty years ago, if a person did not know who the prime minister of Great Britain was, what the conflict in Vietnam was about, or the barest rudiments of how a nuclear reaction worked, he would shrug his shoulders and move on. And if he didn’t bother to know those things, he was in all likelihood politically apathetic and confined his passionate arguing to topics like sports or the attributes of the opposite sex.
There were exceptions, like the Birchers’ theory that fluoridation was a monstrous communist conspiracy, but they were mostly confined to the fringes. Certainly, political candidates with national aspirations steered clear of such balderdash.

At present, however, a person can be blissfully ignorant of how to locate Kenya on a map, but know to a metaphysical certitude that Barack Obama was born there, because he learned it from Fox News. Likewise, he can be unable to differentiate a species from a phylum but be confident from viewing the 700 Club that evolution is “politically correct” hooey and that the earth is 6,000 years old….

Anti-knowledge is a subset of anti-intellectualism, and as Richard Hofstadter has pointed out, anti-intellectualism has been a recurrent feature in American life, generally rising and receding in synchronism with fundamentalist revivalism…

 To a far greater degree than previous outbreaks, fundamentalism has merged its personnel, its policies, its tactics and its fate with a major American political party, the Republicans.

Buttressing this merger is a vast support structure of media, foundations, pressure groups and even a thriving cottage industry of fake historians and phony scientists. From Fox News to the Discovery Institute (which exists solely to “disprove” evolution), and from the Heritage Foundation (which propagandizes that tax cuts increase revenue despite massive empirical evidence to the contrary) to bogus “historians” like David Barton (who confected a fraudulent biography of a piously devout Thomas Jefferson that had to be withdrawn by the publisher), the anti-knowledge crowd has created an immense ecosystem of political disinformation.

I think it is this support structure that is most worrisome, because it enables what political psychologists call “confirmation bias,” the tendency we all share to look for evidence that confirms our pre-existing opinions.

Thanks to modern technologies, any crank or ideologue can create the “evidence” we desire–at least, if we aren’t too fussy about what constitutes evidence.

There’s nothing wrong with genuine ignorance; it can be corrected with credible information. Intentional, stubborn, “faith-based” ignorance, on the other hand, will destroy us.

What We Know That Just Ain’t So

I forget the source of this old quote, but I’ve always liked it: “The problem ain’t what we don’t know, it’s what we know that just ain’t so.”

Recently, a regular reader sent me an article from “NeuroLogica Blog” (there’s obviously a blog for everything) that illustrated that hoary saying.

When asked what percentage of the population is Muslim the average answer was 15% when the reality is 1%. How many people are Christian: average answer 56%, reality 78%. How many people of working age are out of work and seeking a job: average answer 32%, reality 6% (at the time of the survey). That one seems strange. Did people really think the unemployment rate was 32% (that was average, which means some people thought it was higher)? During the great depression the unemployment rate peaked at 25%. What percentage of girls between 15 and 19 years old will give birth: average guess 24%, reality 3%.

As the author noted, the interesting (indeed, the pertinent) question is: why are so many people so misinformed about the facts? After all, these are verifiable and concrete data points, not “facts” that are really value judgments like “socialism is bad” or “religion is good.” And as the author also noted, the internet makes it incredibly easy to locate and verify these facts.

The article listed “the usual suspects”–education that doesn’t sufficiently teach critical thinking skills, a fragmented and frequently lazy media, politicians whose spin (and outright lies) are rewarded. All of these are implicated, but perhaps the best explanation is confirmation bias.

…the tendency to notice, accept, and remember information which confirms your existing narrative. The fact that we have narratives also is a huge factor. There is a tendency to latch onto themes and narratives, and then use facts to support those narratives, rather than to alter our narratives based on the facts. It is therefore no surprise that facts which have political implications have been so distorted to fit political narratives.

In other words, confirmation bias convinces us of things that we want to believe, but that “just ain’t so.”

And we wonder why Americans can’t find common ground.