Tag Archives: propaganda

Media, Left and Right

Well, I see that the Republican Governors Association has decided to enter the “fake news” sweepstakes. According to reports,

The Republican Governors Association has quietly launched an online publication that looks like a media outlet and is branded as such on social media. The Free Telegraph blares headlines about the virtues of GOP governors, while framing Democrats negatively. It asks readers to sign up for breaking news alerts. It launched in the summer bearing no acknowledgement that it was a product of an official party committee whose sole purpose is to get more Republicans elected.

The website was registered July 7 through Domains By Proxy, a company that allows the originators of a website to shield their identities. […] As of early Monday afternoon, The Free Telegraph’s Twitter account and Facebook page still had no obvious identifiers tying the site to RGA. The site described itself on Twitter as “bringing you the political news that matters outside of Washington.” The Facebook account labeled The Free Telegraph a “Media/News Company.”

Evidently, after the Associated Press made inquiries, the site added a very small, grey box at the bottom of the page, disclosing its origins.

The “mastermind” behind this effort is Wisconsin Governor Scott Walker; he may have been inspired (if that’s the word) by Mike Pence’s ill-fated attempt to establish a state-owned Indiana “news bureau” (aka propaganda site). Dubbed by critics “Pravda on the Prairie,” it was embarrassingly obvious and ignominiously withdrawn. Walker is evidently better at stealth.

The problem is, this sort of disinformation campaign works–especially with people who want to believe, who want both their own opinions and their own “facts.” As an article in the American Prospect put it,

As we learn more about how Russia used social media as part of its campaign to help elect Donald Trump, what stands out is how easy it was. Spend $100,000 on Facebook ads, create a bunch of Twitter bots, and before you know it you’ve whipped up a fog of disinformation that gives Trump just the boost he needs to get over the finish line. Even if it’s almost impossible to quantify how many votes it might have swayed, it was one of the many factors contributing to the atmosphere of chaos and confusion that helped Trump get elected.

As new as it might seem, this is just the latest manifestation of a broader problem that goes back a long way, one of the degradation of truth, a conservative electorate taught to disbelieve what’s real and accept whatever lunatic things their media figures tell them, and liberals who can’t figure out how to respond.

As the author points out, a liberal version of these mechanisms won’t work. The effect that liberal media have on their audiences is of a “profoundly different character than what conservative media achieve.”

There’s a doctrinal basis to conservative media that makes it fundamentally different from liberal media, that makes Rush Limbaugh most definitely not the mirror image of a liberal radio host and Sean Hannity not the mirror image of Rachel Maddow. It’s not merely about the conservatives’ and liberals’ respective adherence to truth or penchant for ugly demonization of their opponents, though they differ in that too. It’s that an argument about the larger media world is the foundation of conservative media. Conservative hosts and writers tell their audiences over and over again that nothing they read in the mainstream media can be accepted, that it’s all twisted by a liberal agenda, and therefore they can only believe what conservatives tell them. It’s the driving backbeat to every episode, every story, and every rant.

Liberals complain about media coverage of one story or another all the time. What they don’t do is tell their audiences that any news source that is not explicitly and exclusively devoted to their ideological agenda cannot be trusted. But conservatives do.

The bottom line is that very few of the people who fall within the liberal camp are “good soldiers” in the same way that the Fox News audience is. Liberals still occupy a pretty big tent, and even when they agree on a broad premise–healthcare is a right, for example–they differ significantly on the policies to achieve their goals. As recent research has conclusively shown, conservative and liberal minds work differently.

Which leaves us at the mercy of propaganda. When some people are saying, in effect, “lie to me to reassure me that my tribe is right”–what do we do?

 

Brave New World

Okay, I am now officially worried. Really worried.

A few days ago, The Guardian reported on a recent conference of internet hackers, held in Las Vegas. (Yes, even hackers evidently have conferences….)

Using “psychographic” profiles of individual voters generated from publicly stated interests really does work, according to new research presented at the Def Con hacking conference in Las Vegas, Nevada.

The controversial practice allows groups to hone their messages to match the personality types of their targets during political campaigning, and is being used by firms including Cambridge Analytica and AggregateIQ to better target voters with political advertising with so-called “dark ads”.

Most of us don’t consider ourselves targets for “dark ads,” aka propaganda. We like to believe that we are different–that we’re thoughtful consumers of information, people who can “smell a rat” or otherwise detect spin and disinformation. We shake our heads over reports like the one about the gullible 28-year-old who shot up a Washington, D.C. pizza parlor because stories on social media and conservative websites had convinced him that Hillary Clinton was operating a Satanic child sex ring out of its (nonexistent) basement.

News flash: we are all more gullible than we like to believe. Confirmation bias is built into human DNA. And the Def Con research, presented by Chris Sumner of the Online Privacy Foundation, shows just how exploitable that bias is. As The Guardian explains,

Psychographic profiling classifies people into personality types using data from social networks such as Facebook. Sumner’s research focused on replicating some of the key findings of psychographic research by crafting adverts specifically targeted at certain personality types. Using publicly available data to ensure that the adverts were seen by the right people at the right time, Sumner tested how effective such targeting can be.

The referenced study used information that Facebook already generates about those who use its platform, and created two groups: one composed of “high-authoritarian” conservatives, and a “low-authoritarian” group of liberals.

Knowing the psychographic profiles of the two groups is more useful than simply being able to accurately guess what positions they already hold; it can also be used to craft messages to specifically target those groups, to more effectively shift their opinions. Sumner created four such adverts, two aimed at increasing support for internet surveillance and two aimed at decreasing it, each targeted to a low or high authoritarian group.

For example, the highly authoritarian group’s anti-surveillance advert used the slogan “They fought for your freedom. Don’t give it away!”, over an image of the D-Day landings, while the low authoritarian group’s pro-surveillance message was “Crime doesn’t stop where the internet starts: say YES to state surveillance”.

Sure enough, the targeted adverts did significantly better. The high-authoritarian group was significantly more likely to share a promoted post aimed at them than a similar one aimed at their opposites, while the low authoritarian group ranked the advert aimed at them as considerably more persuasive than the advert that wasn’t.
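
To make the mechanics a little more concrete, here is a minimal Python sketch of the kind of targeting logic the experiment describes: ad copy is chosen by psychographic segment, and engagement is compared between “matched” and “mismatched” audiences. The segment labels, function names and numbers below are purely illustrative assumptions, not Sumner’s actual code or data.

# Hypothetical sketch of psychographic ad targeting, loosely modeled on the
# experiment described above. All names and numbers are invented.

AD_COPY = {
    ("high_authoritarian", "anti_surveillance"):
        "They fought for your freedom. Don't give it away!",
    ("low_authoritarian", "pro_surveillance"):
        "Crime doesn't stop where the internet starts: say YES to state surveillance.",
}

def pick_ad(segment, goal):
    """Return the ad variant crafted for a given psychographic segment."""
    return AD_COPY.get((segment, goal), "generic message")

def share_rate(shares, impressions):
    """Crude engagement metric: shares per impression."""
    return shares / impressions if impressions else 0.0

# Toy comparison of matched vs. mismatched targeting (made-up counts).
matched = share_rate(shares=120, impressions=5000)      # ad aimed at this segment
mismatched = share_rate(shares=35, impressions=5000)    # ad aimed at the other segment
print(pick_ad("high_authoritarian", "anti_surveillance"))
print(f"matched: {matched:.2%}  mismatched: {mismatched:.2%}")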

Think about the implications of this. Political campaigns can now target different messages to different groups far more efficiently and effectively than they could when the only mechanisms available were direct mail campaigns or placement of television ads. As the article noted, this technology allows politicians to appeal to the worst side of voters in an almost undiscoverable manner.

The importance of motivating and turning out your base is a “given” in electoral politics, and these new tools are undoubtedly already in use–further eroding the democratic ideal in which votes are cast after citizens weigh information provided through public policy debates conducted by honorable candidates using verifiable facts.

Thanks to gerrymandering, most of us don’t have genuine choices for Congress or our state legislatures on election day. Now, thanks to technology, we won’t be able to tell the difference between facts and “alternative facts.”

About That Echo Chamber…

As this blog frequently notes, one of the thorniest problems bedeviling our unravelling democracy is the distortion of reality–intentional and unintentional–provided via the Internet. That distortion is immensely aided by our tendency to live in echo chambers populated by friends who think like we do.

Most of us trust links from friends – a vulnerability exploited by phishing sites and other forms of online manipulation. An increasing number of us “unfriend” contacts who post uncongenial opinions or facts inconsistent with our political prejudices.

This is a real problem.

On the one hand, citizens who occupy different realities cannot have productive conversations or negotiate practical solutions to common problems; on the other hand, censorship of electronic media, in an effort to separate wheat from chaff, is neither wise nor possible.

Can technology save us?

Most of us, whatever our political orientation, recognize the problem. As an IU Professor of Computer Science and Informatics puts it,

If you get your news from social media, as most Americans do, you are exposed to a daily dose of hoaxes, rumors, conspiracy theories and misleading news. When it’s all mixed in with reliable information from honest sources, the truth can be very hard to discern.

In fact, my research team’s analysis of data from Columbia University’s Emergent rumor tracker suggests that this misinformation is just as likely to go viral as reliable information.

As he notes, the Internet has spawned an entire industry of fake news and digital misinformation.

Clickbait sites manufacture hoaxes to make money from ads, while so-called hyperpartisan sites publish and spread rumors and conspiracy theories to influence public opinion….

This industry is bolstered by how easy it is to create social bots, fake accounts controlled by software that look like real people and therefore can have real influence. Research in my lab uncovered many examples of fake grassroots campaigns, also called political astroturfing.

In response, we developed the BotOrNot tool to detect social bots. It’s not perfect, but accurate enough to uncover persuasion campaigns in the Brexit and antivax movements. Using BotOrNot, our colleagues found that a large portion of online chatter about the 2016 elections was generated by bots.
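
For readers wondering what “detecting social bots” looks like in practice, here is a toy sketch of the general idea: score an account on a few crude behavioral signals. To be clear, this is not BotOrNot’s actual method (the real tool uses a trained machine-learning classifier over a much richer set of account features), and every threshold and weight below is an invented assumption.

# Toy illustration of bot-likeness scoring. This is NOT BotOrNot's algorithm;
# all thresholds and weights are invented for illustration only.

def bot_likeness(tweets_per_day, followers, following,
                 default_profile_image, account_age_days):
    """Return a rough 0-to-1 score; higher means more bot-like."""
    score = 0.0
    if tweets_per_day > 100:                       # inhumanly high posting volume
        score += 0.35
    if following and followers / following < 0.1:  # follows far more than it is followed
        score += 0.25
    if default_profile_image:                      # no effort spent on a persona
        score += 0.20
    if account_age_days < 30:                      # brand-new account
        score += 0.20
    return min(score, 1.0)

# Example: a week-old account posting 300 times a day with no avatar.
print(bot_likeness(tweets_per_day=300, followers=12, following=800,
                   default_profile_image=True, account_age_days=7))   # 1.0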

The real question–as the author readily concedes–is how to combat the technology that spreads propaganda, or “fake news.” As he says, the first step is to analyze how these sites operate. Then we can hope that smart people adept in the use of these technologies can devise tools to combat the spread of false and misleading information.

Long-term, however, “fixing” the problem of fake news will require fixing the humans who have a need to believe whatever it is that such “news” is peddling. That fix will necessarily begin with better civic education and news literacy, but it can’t end there.

Ultimately, we have a problem of political psychology… It would seem that we humans have invented tools that have outstripped our ability to properly use them.

Worldviews Black and White

On Sunday, the Washington Post had an article tracing the influence of what it called “shadow charities” on shaping the political climate that led to the election of Donald Trump. It focused upon the career of

David Horowitz, a former ’60s radical who became an intellectual godfather to the far right through his writings and his work at a charity, the David Horowitz Freedom Center. Since its formation in 1988, the Freedom Center has helped cultivate a generation of political warriors seeking to upend the Washington establishment. These warriors include some of the most powerful and influential figures in the Trump administration: Attorney General Sessions, senior policy adviser Miller and White House chief strategist Stephen K. Bannon.

The article raised several issues, including the blurred line between actual charities and the current IRS definition of not-for-profit organizations entitled to tax exempt status. That issue is important; taxpayers are subsidizing nonprofit “educational” activities that are more accurately described as promoting political propaganda.

That said, absent a wholesale revision of the tax code and a considerable reduction in the categories we deem eligible for tax-exempt status, this will not be an easy problem to fix. My version of propaganda is likely to be very different from, say, Mike Pence’s.

What was particularly interesting to me was the description of Horowitz, and his trajectory from far left to the even farther right.

Horowitz was a “red diaper baby” of communist parents in New York City. After attending Columbia University in the 1950s, he enrolled as a graduate student at the University of California at Berkeley, an anchor of leftist thinking.

Over the next two decades, he took on prominent roles in the New Left. He served as an editor of Ramparts, an influential muckraking magazine in San Francisco.

But by the late 1970s, he had decided that the left represented a profound threat to the United States. On March 17, 1985, he and a writing partner came out as conservatives in a surprising Washington Post Magazine article headlined “Lefties for Reagan.”

In August 1988, Horowitz launched the Center for the Study of Popular Culture in Los Angeles, a nonprofit group that would become the Freedom Center.

We all know literary and political figures who have made the journey from Left to Right, or Right to Left. Horowitz reminds me of a relative of mine who was a pontificating “Young Socialist” in college, to the great consternation of his much more conservative family; when I ran into him many years later, he was an equally rabid and doctrinaire right-winger.

I have come to realize that most of these “conversions” have very little to do with the content of the political philosophies involved. These are not people who have mellowed with age and softened formerly rigid worldviews. For whatever reason, they have “swapped” Certainty A for Certainty B. We live in a complicated world, where “right” and “wrong” are often ambiguous, and bright lines are hard to come by. For many people, that moral ambiguity is intolerable. They need certainty. They need to be able to distinguish the good guys from the bad guys.

And they desperately need to believe that they are with the “good guys.”

We see much of the same phenomenon in our churches, synagogues and mosques: there are members who value their congregations for the warmth of community, who listen to sermons for illumination into life’s “big questions” and for the insights and guidance offered by their particular doctrines. There are other members who see those doctrines as literal commands from On High, as blackletter law removed from any historical context or nuanced interpretation.

Some people have a psychological need to hold tight to dogma–whether Left or Right, political or religious–in order to function. They need a world that is reliably black and white, where rules are clear and unambiguous, and where good guys and bad guys are easily identified.

The messy uncertainties and complexities of modern life are challenging to all of us. Accepting a doctrine that purports to explain what is otherwise confusing and threatening–a doctrine that identifies friends and enemies– is a huge temptation.

It’s a temptation we need to resist.

 

It’s Much More Than Just Fake News

This is the time of year when my students–graduate and undergraduate–present the results of their research projects to their classmates (and, of course, me). One of my better undergraduate students focused upon the legal implications of the increasing use of household “personal assistants”–those “Siri for home use” voice-activated electronic devices like Amazon’s “Echo.”

In addition to detailing the investigative uses of such devices by law enforcement, he pointed out their potential for informational mischief, especially when they are asked to conduct a search. Unlike a Google search performed on a computer screen–which yields pages of results, highlighting inconsistent responses and the questionable credibility of some of them–a virtual assistant simply responds with whatever information has been moved up the response list by someone good at search engine optimization.

His example: responding to the question “who won the popular vote,” one personal assistant read from a single (conspiracy) site reporting that Trump had actually won the popular vote. No list, no context, no description of the source.
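
The difference he was pointing to is easy to see in a few lines of hypothetical Python: a screen-based search surfaces a ranked list whose sources a reader can weigh against one another, while a voice assistant effectively reads back only whatever sits at the top of that ranking. The result list, source names and ranking below are invented for illustration.

# Hypothetical illustration of screen search vs. voice assistant.
# The ranked results and source names are invented.

ranked_results = [
    {"source": "conspiracy-blog.example", "claim": "Trump actually won the popular vote"},
    {"source": "wire-service.example",    "claim": "Clinton won the popular vote by about 2.9 million"},
    {"source": "election-data.example",   "claim": "Certified state-by-state vote totals"},
]

def screen_search(results):
    """A screen shows the whole ranked list, so competing sources are visible."""
    return results

def voice_assistant(results):
    """A voice assistant typically reads back only the top-ranked result:
    no list, no context, no visible source."""
    return results[0]["claim"]

for result in screen_search(ranked_results):
    print(result["source"], "-", result["claim"])

print(voice_assistant(ranked_results))
# Prints whichever claim search engine optimization has pushed to the top.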

If the implications of his presentation weren’t troubling enough, a report from the Medium website gave me chills.

A data scientist and others had begun digging into so-called “fake news” sites after the election. It soon became clear to them that they were dealing with a phenomenon that encompassed much more than just a few fake news stories. It was a piece of a much bigger and darker puzzle — a Weaponized AI Propaganda Machine being used to manipulate public opinions and behaviors to advance specific political agendas.

By leveraging automated emotional manipulation alongside swarms of bots, Facebook dark posts, A/B testing, and fake news networks, a company called Cambridge Analytica has activated an invisible machine that preys on the personalities of individual voters to create large shifts in public opinion. Many of these technologies have been used individually to some effect before, but together they make up a nearly impenetrable voter manipulation machine that is quickly becoming the new deciding factor in elections around the world.

Most recently, Analytica helped elect U.S. President Donald Trump, secured a win for the Brexit Leave campaign, and led Ted Cruz’s 2016 campaign surge, shepherding him from the back of the GOP primary pack to the front.

The company is owned and controlled by conservative and alt-right interests that are also deeply entwined in the Trump administration. The Mercer family is both a major owner of Cambridge Analytica and one of Trump’s biggest donors. Steve Bannon, in addition to acting as Trump’s Chief Strategist and a member of the White House Security Council, is a Cambridge Analytica board member. Until recently, Analytica’s CTO was the acting CTO at the Republican National Convention.

Analytica has declined to work on any Democratic campaigns, and according to the story, is negotiating to help Trump both manage public opinion around his presidency and expand sales for the Trump Organization.

Cambridge Analytica is now expanding aggressively into U.S. commercial markets and is also meeting with right-wing parties and governments in Europe, Asia, and Latin America….

There’s been a wave of reporting on Cambridge Analytica itself and solid coverage of individual aspects of the machine — bots, fake news, microtargeting — but none so far (that we have seen) that portrays the intense collective power of these technologies or the frightening level of influence they’re likely to have on future elections.