The New Censorship

One of the many causes of increased tribalism and chaos worldwide is the unprecedented nature of the information environment we inhabit. A quote from Yuval Noah Harari’s Homo Deus is instructive:

In the past, censorship worked by blocking the flow of information. In the twenty-first century, censorship works by flooding people with irrelevant information.

We are only dimly beginning to understand the nature of the threat posed by the mountains of “information” with which we are inundated. Various organizations are mounting efforts to fight that threat–to increase news literacy and control disinformation–with results that are thus far imperceptible.

The Brookings Institution has engaged in one of those efforts; it has a series on Cybersecurity and Election Interference and, in a recent report, offered four steps to “stop the spread of disinformation.” The linked report begins by making an important point about the actual targets of such disinformation.

The public discussion of disinformation often focuses on targeted candidates, without recognizing that disinformation actually targets voters. In the case of elections, actors both foreign and domestic are trying to influence whether or not you as an individual vote, and for whom to cast your ballot. The effort goes farther than elections: it is about the information on whether to vaccinate children or boycott the NFL. What started with foreign adversaries now includes domestic groups, all fighting for control over what you believe to be true.

The report also recognizes that the preservation of democratic and economic institutions in the digital era will ultimately depend on efforts to control disinformation by government and the various platforms on which it is disseminated. Since the nature of the necessary action is not yet clear–so far as I can tell, we don’t have a clue how to accomplish this–Brookings says that the general public needs to make itself less susceptible, and its report offers four ways to accomplish that.

You’ll forgive me if I am skeptical of the ability/desire of most Americans to follow their advice, but for what it is worth, here are the steps they advocate:

Know your algorithm
Get to know your own social media feed and algorithm, because disinformation targets us based on our online behavior and our biases. Platforms cater information to you based on what you stop to read, engage with, and send to friends. This information is then accessible to advertisers and can be manipulated by those who know how to do so, in order to target you based on your past behavior. The result is we are only seeing information that an algorithm thinks we want to consume, which could be biased and distorted.

Retrain your newsfeed
Once you have gotten to know your algorithm, you can change it to start seeing other points of view. Repeatedly seek out reputable sources of information that typically cater to viewpoints different than your own, and begin to see that information occur in your newsfeed organically.

Scrutinize your news sources
Start consuming information from social media critically. Social media is more than a news digest—it is social, and it is media. We often scroll through passively, absorbing a combination of personal updates from friends and family—and if you are among the two-thirds of Americans who report consuming news on social media—you are passively scrolling through news stories as well. A more critical eye to the information in your feed and being able to look for key indicators of whether or not news is timely and accurate, such as the source and the publication date, is incredibly important.

Consider not sharing
Finally, think before you share. If you think that a “news” article seems too sensational or extreme to be true, it probably is. By not sharing, you are stopping the flow of disinformation and falsehoods from getting across to your friends and network. While the general public cannot be relied upon to solve this problem alone, it is imperative that we start doing our part to stop this phenomenon. It is time to stop waiting for someone to save us from disinformation, and to start saving ourselves.

All good advice. Why do I think the people who most need to follow it, won’t?


Gains and (Huge) Losses

In the age of the internet, I worry that it is no longer possible to have a truly national conversation.

The ability of social media platforms to target recipients for advertising and other information based upon sophisticated analyses of individual preferences threatens the very existence of a genuinely public sphere in which a true First Amendment marketplace of ideas might operate. As one scholar of the media despairingly asked, “How can you cure the effects of ‘bad’ speech with more speech when you have no means to target the same audience that received the original message?”

We are clearly in uncharted waters.

As regular readers of this blog know, I teach a course in Media and Public Affairs. It used to be titled “Mass Media and Public Affairs”; the name change reflects a change in the reality of our methods of communication: there’s no truly “mass” media anymore.

The subject matter covered in the course has morphed along with the media it studies. When then-Dean of Journalism Jim Brown and I began team-teaching it more than a decade ago, our goal was relatively simple–introduce journalism students to policy formation (so they would better understand how coverage of government affects policy), and help public affairs students understand the difference between what journalists consider “news”–and thus worthy of coverage–and garden-variety policy argumentation.

Over the years, the media environment has fragmented and dramatically changed, and so has the course. Today, it focuses on the role of media in a democratic society, beginning with the assumption that the ability of citizens to participate in the democratic process on the basis of informed decisions is heavily dependent upon the quality, factual accuracy, objectivity and completeness of the information available to them. We examine the responsibility of the “fourth estate” to the public it serves, and the role of media in the American political system.

We look at the legal and ethical constraints that should apply to a free press, the business pressures that affect reporting, the impact of technology and social media, the role of political pundits, the challenges of issue framing, the impact of American diversity on the profession of journalism and–with increasing urgency– how to assess the credibility of the innumerable “news” resources available to us.

We also consider the dramatic collapse of what has come to be called “legacy journalism,” and the consequences of the current information environment for democratic and accountable governance.

Throughout the class, I keep coming back to that one core issue: how the incommensurate realities and filter bubbles we inhabit (thanks to both confirmation bias and the wildly different sources of information that are available to us) make it increasingly impossible to have a genuinely public discussion.

I think it was media historian Paul Starr who said that a public is different from an audience. An audience is fine for entertainment; a democratic polity, however, requires a public, and I’m not sure we have one anymore.

There is so much that is wonderful about the Internet; the technology has made unlimited information immediately available to us. It has allowed in-depth explorations, introduced dramatically diverse people to each other, made the arts accessible, allowed the human imagination to soar. (It has also made shopping infinitely more convenient…)

On the other hand, it has destroyed the business model that sustained most local newspapers–a grievous loss for multiple reasons, including the way that loss has influenced trust in media generally. As Michelle Goldberg recently wrote in the New York Times,

In general, people trust local papers more than the national media; when stories are about your immediate community, you can see they’re not fake news. Without a trusted news source, people are more vulnerable to the atmosphere of disinformation, cynicism and wild conspiracy theories in which fascism — and Trumpism — flourishes. Politico found that “Voters in so-called news deserts — places with minimal newspaper subscriptions, print or online,” voted for Trump in higher-than-expected numbers, even accounting for employment and education.

We live in a world of Kardashians and clickbait, Infowars and propagandists, cute kittens and adorable babies and weird cookie recipes–a world of inadequate coverage of local governments and overwhelmingly partisan coverage of national issues. In that world, the American public has devolved into a variety of audiences–and lost most of the common ground necessary to exist as a public.

No wonder we’re polarized.


Baffle Them With Bullshit

The BBC recently opined that the goal of all those Russian bots and trolls isn’t to convince Americans of any particular fact or position–it’s to bombard us with so many competing versions of everything that nothing makes sense.

The observation reminded me of the old saying, “If you can’t convince them with your arguments, baffle them with your bullshit.”

CNN recently ran a story with a similar premise: the title was “Why Russian Trolls May be More Excited That the NFL is Back Than You Are.”

The same Kremlin-linked group that posed as Americans on social media during the 2016 US presidential election has repeatedly exploited the controversy surrounding the NFL and players who have protested police brutality and racial injustice during the National Anthem, playing both sides in an effort to exacerbate divides in American society.

The debate is almost certainly an irresistible one for the Russians, given that it includes issues of race, patriotism, and national identity — topics the Russian trolls sought to exploit during the run-up to the election, and have continued to focus on in the two years since.

Propaganda in the age of the Internet has gotten far more sophisticated, and the goals it pursues are no longer limited to winning a particular debate or political campaign. The changes started in earnest with Big Tobacco–the PR firms trying to head off new regulations realized that a frontal attack on the medical science linking smoking to cancer would fail, because contrary scientific studies paid for by the tobacco companies wouldn’t be seen as credible.

Instead, they hit on a tactic that has since been used to great effect by other special interests, most notably fossil fuel companies denying climate change: they claimed that the evidence was still “inconclusive” and Congress should wait for more information before acting. Encouraging confusion was far more effective than attacking the science. The tactic played into the reluctance of lawmakers to pick a side in contentious debates.

It’s even easier for the Russians, because their goal is simply to divide us. They don’t care which side “wins” a debate–their goal is to add fuel to the fire and watch it burn.

Darren Linvill, an associate professor at Clemson University who has been studying the Russian group’s behavior with his colleague Patrick Warren, explained that the trolls “don’t slant toward one side or the other in the NFL flag debate, but they do slant very steeply to both extremes.”

“Kaepernick is either a hero fighting a corrupt system or a villain who has betrayed his country. It’s two very simple, divisive story lines told at the same time with the goal of dividing our country rather than adding nuance to an ongoing, important national conversation.”

The most pernicious aspect of a fragmented media environment in which partisans can “shop” for the realities they want to find is the overwhelming uncertainty that less ideological citizens experience. We no longer know which sources are credible, which advocacy groups we can trust, which “breaking news” items have been vetted and verified.

We don’t know what’s bullshit and what isn’t–and that’s paralyzing.
