Tag Archives: motivated reasoning

Denial Isn’t Just A River In Egypt

Sorry about that bad pun, but these days, even pathetic humor is a respite from the daily news…
And speaking of the daily news–according to one recent report on the pandemic, new cases have increased by 84% in states that don’t require the wearing of masks, and fallen by 25% in states that do.
 
You might consider that a clue, just a small hint that we should trust science.

After all, those numbers would seem to confirm what all those doctors and epidemiologists have been saying: mask-wearing protects us (or more accurately, protects other people from being infected by those of us who are asymptomatic). Evidently, however, America’s tribal polarization has overwhelmed sanity.
 
The polls tell us that a sizable majority of Americans strongly favor measures to control the spread of the pandemic over efforts to “reopen” the economy. When those numbers are broken down, however, Republican voters disagree—prioritizing the economy.
 
Self-identified Democrats are significantly more likely to wear a mask and engage in social distancing than self-identified Republicans.
 
The polling reminds me of a survey I saw a couple of years ago—well before the pandemic—in which significant numbers of Americans who would not object to their children marrying across racial or religious lines strongly disapproved of the prospect of that child marrying someone of the opposite political party.
 
Talk about “identity politics”!
 
In today’s highly polarized America, an individual’s self-identification as Republican or Democrat has come to signify a wide range of attitudes and beliefs not necessarily limited to support for a political party. Political scientist Lilliana Mason has argued that “A single vote can now indicate a person’s partisan preferences as well as his or her religion, race, ethnicity, gender, neighborhood and favorite grocery store.”

Democrat and Republican have become our new mega-identities.
 
The fact of extreme partisan polarization doesn’t, however, explain why identifying as Republican means being substantially less likely to believe the science that tells us Covid-19 poses a genuine threat. Of course, there’s President Trump’s determination to ignore the threat—to insist it is an artifact of testing (!), or a Democratic “hoax,” but in a recent New York Times column, Paul Krugman offered a different theory, arguing that the G.O.P.’s coronavirus denial is rooted in a worldview that goes well beyond Trump and his electoral prospects. Krugman argued that Covid-19 is like climate change: It isn’t the kind of menace the party wants to acknowledge.
 
“It’s not that the right is averse to fearmongering. But it doesn’t want you to fear impersonal threats that require an effective policy response, not to mention inconveniences like wearing face masks; it wants you to be afraid of people you can hate — people of a different race or supercilious liberals.”
 
As Adrian Bardon of Wake Forest University recently wrote in The Conversation, Americans increasingly exist in highly polarized, informationally insulated ideological communities, each occupying its own information universe, and engage in what political scientists call “motivated reasoning” to dismiss inconvenient or unwelcome facts.

In all fairness, this phenomenon isn’t limited to today’s GOP; the “anti-vaxxers” and “anti-GMO” activists tend to come from the left side of the political spectrum and are equally dismissive of science that doesn’t fit with their ideological preferences.
 
In his book, The Truth About Denial, Bardon reminds us that our human “sense of self” is intimately tied to our tribal membership and our identity group’s beliefs. We are all prone to engage in confirmation bias (what we used to call “cherry picking”), accepting expert testimony that confirms our prejudices and rejecting facts and data that contradict them.
 
Unfortunately, in some situations, ignoring facts can kill you. Or grandma.
 

Messing With Our Minds

As if the websites peddling conspiracy theories and political propaganda weren’t enough, we now have to contend with “Deepfakes.” Deepfakes, according to the Brookings Institution, are 

videos that have been constructed to make a person appear to say or do something that they never said or did. With artificial intelligence-based methods for creating deepfakes becoming increasingly sophisticated and accessible, deepfakes are raising a set of challenging policy, technology, and legal issues.

Deepfakes can be used in ways that are highly disturbing. Candidates in a political campaign can be targeted by manipulated videos in which they appear to say things that could harm their chances for election. Deepfakes are also being used to place people in pornographic videos that they in fact had no part in filming.

Because they are so realistic, deepfakes can scramble our understanding of truth in multiple ways. By exploiting our inclination to trust the reliability of evidence that we see with our own eyes, they can turn fiction into apparent fact. And, as we become more attuned to the existence of deepfakes, there is also a subsequent, corollary effect: they undermine our trust in all videos, including those that are genuine. Truth itself becomes elusive, because we can no longer be sure of what is real and what is not.

The linked article notes that researchers are trying to devise technologies to detect deepfakes, but until there are apps or other tools that can identify these very sophisticated forgeries, we are left with “legal remedies and increased awareness,” neither of which is very satisfactory.

We already inhabit an information environment that has done more damage to social cohesion than previous efforts to divide and mislead. Thanks to the ubiquity of the Internet and social media (and the demise of media that can genuinely be considered “mass”), we are all free to indulge our confirmation biases–free to engage in what a colleague dubs “motivated reasoning.” It has become harder and harder to separate truth from fiction, moderate spin from outright propaganda.

One result is that thoughtful people–people who want to be factually accurate and intellectually honest–are increasingly unsure of what they can believe.

What makes this new fakery especially dangerous is that, as the linked article notes, most of us do think that “seeing is believing.” We are far more apt to accept visual evidence than other forms of information. There are already plenty of conspiracy sites that offer altered photographic “evidence”–of the aliens who landed at Roswell, of purportedly criminal behavior by public figures, etc. Now people intent on deception have the ability to make those alterations virtually impossible to detect.

Even if technology is developed that can detect fakery, will “motivated” reasoners rely on it?

Will people be more likely to believe a deepfake or a detection algorithm that flags the video as fabricated? And what should people believe when different detection algorithms—or different people—render conflicting verdicts regarding whether a video is genuine?

We are truly entering a new and unsettling “hall of mirrors” version of reality.

Ideology and the Informed Voter

Well, this is depressing.

We like to think that more informed voters are “better” voters–more likely to make reasoned decisions, more likely to base those decisions on evidence rather than emotion or prejudice.

We’d like to think that, but apparently we’d be wrong. Research increasingly confirms that more information does not necessarily translate into better judgment.

An informed voter is only as good as her information sources. And because we all get to choose which information sources to believe, voters with more information are not always more informed. Sometimes, they’re just more completely and profoundly misled.

Looking at the 1996 election, for instance, Achen and Bartels studied whether voters knew the budget deficit had dropped during President Clinton’s first term (it had, and sharply). What they found will shake anyone who believes more information leads to a smarter electorate: how much voters knew about politics mattered less than which party they supported. Republicans in the 80th percentile of political knowledge were less likely to answer the question correctly than Democrats in the 20th percentile of political knowledge.

It gets worse: Republicans in the 60th percentile of political knowledge were less likely to answer the question correctly than Republicans in the 10th percentile of political knowledge — which suggests that at least some of what we learn as we become more politically informed is how to mask our partisanship.

This is all part of what political scientists call “motivated reasoning”–the very human tendency to filter information through our personal worldviews.

Those of us who follow politics most closely do so because we care about issues of governance and have developed value structures and perspectives through which we analyze the information we acquire. The more invested we are in a particular approach to an issue, the more likely we are to apply our ideological “spin” to information about that issue.

It seems counter-intuitive, but it may be that voters who are less invested in partisan politics and political philosophy–who don’t have a dog in the fight, as the saying goes–are actually more likely to cast votes based upon more or less dispassionate evaluations of the candidates and their campaigns.

If so, the more people who vote, the better.

About Those “Liberal” Professors

One of my graduate students pointed me to an interesting article in the Chronicle of Higher Education, highlighting a study into the persistent accusation that “liberal” professors are guilty of politically indoctrinating their students.

Dodson’s analysis of the data shows that students who become engaged academically are likely to spend more time talking about political issues and to become more engaged in civic life.

With regard to political views, academic engagement promoted moderation. “[T]he results indicate — in contrast to the concerns of many conservative commentators — that academic involvement generally moderates attitudes,” Dodson writes. “While conservative students do become more liberal as a result of academic involvement, liberals become more conservative as a result of their academic involvement. Indeed it appears that a critical engagement with a diverse set of ideas — a hallmark of the college experience — challenges students to re-evaluate the strength of their political convictions.”

The data on student activities demonstrate the opposite impact: The more involved that liberal students get, the more liberal they become, while the more involved conservative students get, the more conservative they become. “This finding suggests that students seek out and engage with familiar social environments — a choice that leads to the strengthening of their political beliefs.”

This research is consistent with a study I saw a few years ago: when people who were moderately inclined to believe X were placed in a discussion group with others who all believed X, they emerged from the experience much more invested in X. People who participated in more diverse discussions–who were placed in groups representing a range of positions on X–developed more nuanced (and less dogmatic) opinions about X.

It all comes back to what academics call motivated reasoning… the willingness of people invested in a particular worldview to choose the news and select the information environments that reinforce their pre-existing beliefs.

A good teacher provides students with a wide range of relevant information, at least some of which will inevitably challenge their worldviews. As I tell my students, it’s my job to confuse you. I’ll know I’ve succeeded if, after taking my class, students use two phrases more frequently: “it depends,” and “it’s more complicated than that.”

Because, really–it is more complicated than that.