Tag Archives: propaganda

Brave New World

Okay, I am now officially worried. Really worried.

A few days ago, The Guardian reported on a recent conference of internet hackers, held in Las Vegas. (Yes, even hackers evidently have conferences….)

Using “psychographic” profiles of individual voters generated from publicly stated interests really does work, according to new research presented at the Def Con hacking conference in Las Vegas, Nevada.

The controversial practice allows groups to hone their messages to match the personality types of their targets during political campaigning, and is being used by firms including Cambridge Analytica and AggregateIQ to better target voters with political advertising with so-called “dark ads”.

Most of us don’t consider ourselves targets for “dark ads” aka propaganda. We like to believe that we are different–that we’re thoughtful consumers of information, people who can “smell a rat” or otherwise detect spin and disinformation. We shake our heads over reports like the one about the gullible 28-year-old who shot up a Washington, D.C. pizza parlor because stories on social media and conservative websites had convinced him that Hillary Clinton was operating a Satanic child sex ring out of its (nonexistent) basement.

News flash: we are all more gullible than we like to believe. Confirmation bias is built into human DNA.

Psychographic profiling classifies people into personality types using data from social networks such as Facebook. Sumner’s research focused on replicating some of the key findings of psychographic research by crafting adverts specifically targeted at certain personality types. Using publicly available data to ensure that the adverts were seen by the right people at the right time, Sumner tested how effective such targeting can be.

The referenced study used information that Facebook already generates about those who use its platform, and created two groups: one composed of “high-authoritarian” conservatives, and a “low-authoritarian” group of liberals.

Knowing the psychographic profiles of the two groups is more useful than simply being able to accurately guess what positions they already hold; it can also be used to craft messages to specifically target those groups, to more effectively shift their opinions. Sumner created four such adverts, two aimed at increasing support for internet surveillance and two aimed at decreasing it, each targeted to a low or high authoritarian group.

For example, the highly authoritarian group’s anti-surveillance advert used the slogan “They fought for your freedom. Don’t give it away!”, over an image of the D-Day landings, while the low authoritarian group’s pro-surveillance message was “Crime doesn’t stop where the internet starts: say YES to state surveillance”.

Sure enough, the targeted adverts did significantly better. The high-authoritarian group was significantly more likely to share a promoted post aimed at them than a similar one aimed at their opposites, while the low authoritarian group ranked the advert aimed at them as considerably more persuasive than the advert that wasn’t.

Think about the implications of this. Political campaigns can now target different messages to different groups far more efficiently and effectively than they could when the only mechanisms available were direct mail campaigns or placement of television ads. As the article noted, this technology allows politicians to appeal to the worst side of voters in an almost undiscoverable manner.

The importance of motivating and turning out your base is a “given” in electoral politics, and these new tools are undoubtedly already in use–further eroding the democratic ideal in which votes are cast after citizens weigh information provided through public policy debates conducted by honorable candidates using verifiable facts.

Thanks to gerrymandering, most of us don’t have genuine choices for Congress or our state legislatures on election day. Now, thanks to technology, we won’t be able to tell the difference between facts and “alternative facts.”

About That Echo Chamber…

As this blog frequently notes, one of the thorniest problems bedeviling our unravelling democracy is the distortion of reality–intentional and unintentional–provided via the Internet. That distortion is immensely aided by our tendency to live in echo chambers populated by friends who think like we do.

Most of us trust links from friends – a vulnerability exploited by phishing sites and other forms of online manipulation. An increasing number of us “unfriend” contacts who post uncongenial opinions or facts inconsistent with our political prejudices.

This is a real problem.

On the one hand, citizens who occupy different realities cannot have productive conversations or negotiate practical solutions to common problems; on the other hand, censorship of electronic media, in an effort to separate wheat from chaff, is neither wise nor possible.

Can technology save us?

Most of us, whatever our political orientation, recognize the problem. As an IU professor of computer science and informatics puts it,

If you get your news from social media, as most Americans do, you are exposed to a daily dose of hoaxes, rumors, conspiracy theories and misleading news. When it’s all mixed in with reliable information from honest sources, the truth can be very hard to discern.

In fact, my research team’s analysis of data from Columbia University’s Emergent rumor tracker suggests that this misinformation is just as likely to go viral as reliable information.

As he notes, the Internet has spawned an entire industry of fake news and digital misinformation.

Clickbait sites manufacture hoaxes to make money from ads, while so-called hyperpartisan sites publish and spread rumors and conspiracy theories to influence public opinion….

This industry is bolstered by how easy it is to create social bots, fake accounts controlled by software that look like real people and therefore can have real influence. Research in my lab uncovered many examples of fake grassroots campaigns, also called political astroturfing.

In response, we developed the BotOrNot tool to detect social bots. It’s not perfect, but accurate enough to uncover persuasion campaigns in the Brexit and antivax movements. Using BotOrNot, our colleagues found that a large portion of online chatter about the 2016 elections was generated by bots.

The real question–as the author readily concedes–is how to combat technology that spreads propaganda, or “fake news.” As he says, the first step is to analyze how these sites operate. Then we can hope that smart people adept in the use of these technologies can devise tools to combat the spread of false and misleading information.

Long-term, however, “fixing” the problem of fake news will require fixing the humans who have a need to believe whatever it is that such “news” is peddling. That fix will necessarily begin with better civic education and news literacy, but it can’t end there.

Ultimately, we have a problem of political psychology…It would seem that we humans have invented tools that have outstripped our ability to properly use them.

Worldviews Black and White

On Sunday, the Washington Post had an article tracing the influence of what it called “shadow charities” on shaping the political climate that led to the election of Donald Trump. It focused upon the career of

David Horowitz, a former ’60s radical who became an intellectual godfather to the far right through his writings and his work at a charity, the David Horowitz Freedom Center. Since its formation in 1988, the Freedom Center has helped cultivate a generation of political warriors seeking to upend the Washington establishment. These warriors include some of the most powerful and influential figures in the Trump administration: Attorney General Sessions, senior policy adviser Miller and White House chief strategist Stephen K. Bannon.

The article raised several issues, including the blurred line between actual charities and the current IRS definition of not-for-profit organizations entitled to tax exempt status. That issue is important; taxpayers are subsidizing nonprofit “educational” activities that are more accurately described as promoting political propaganda.

That said, absent a wholesale revision of the tax code and a considerable reduction in the categories we deem eligible for tax-exempt status, this will not be an easy problem to fix. My version of propaganda is likely to be very different from, say, Mike Pence’s.

What was particularly interesting to me was the description of Horowitz, and his trajectory from the far left to the even farther right.

Horowitz was a “red diaper baby” of communist parents in New York City. After attending Columbia University in the 1950s, he enrolled as a graduate student at the University of California at Berkeley, an anchor of leftist thinking.

Over the next two decades, he took on prominent roles in the New Left. He served as an editor of Ramparts, an influential muckraking magazine in San Francisco.

But by the late 1970s, he had decided that the left represented a profound threat to the United States. On March 17, 1985, he and a writing partner came out as conservatives in a surprising Washington Post Magazine article headlined “Lefties for Reagan.”

In August 1988, Horowitz launched the Center for the Study of Popular Culture in Los Angeles, a nonprofit group that would become the Freedom Center.

We all know literary and political figures who have made the journey from Left to Right, or Right to Left. Horowitz reminds me of a relative of mine who was a pontificating “Young Socialist” in college, to the great consternation of his much more conservative family; when I ran into him many years later, he was an equally rabid and doctrinaire right-winger.

I have come to realize that most of these “conversions” have very little to do with the content of the political philosophies involved. These are not people who have mellowed with age and softened formerly rigid worldviews. For whatever reason, they have “swapped” Certainty A for Certainty B. We live in a complicated world, where “right” and “wrong” are often ambiguous, and bright lines are hard to come by. For many people, that moral ambiguity is intolerable. They need certainty. They need to be able to distinguish the good guys from the bad guys.

And they desperately need to believe that they are with the “good guys.”

We see much of the same phenomenon in our churches, synagogues and mosques: there are members who value their congregations for the warmth of community, who listen to sermons for illumination into life’s “big questions” and for the insights and guidance offered by their particular doctrines. There are other members who see those doctrines as literal commands from On High, as blackletter law removed from any historical context or nuanced interpretation.

Some people have a psychological need to hold tight to dogma–whether Left or Right, political or religious–in order to function. They need a world that is reliably black and white, where rules are clear and unambiguous, and where good guys and bad guys are easily identified.

The messy uncertainties and complexities of modern life are challenging to all of us. Accepting a doctrine that purports to explain what is otherwise confusing and threatening–a doctrine that identifies friends and enemies– is a huge temptation.

It’s a temptation we need to resist.


It’s Much More Than Just Fake News

This is the time of year when my students–graduate and undergraduate–present the results of their research projects to their classmates (and, of course, me). One of my better undergraduate students focused upon the legal implications of the increasing use of household “personal assistants”–voice-activated devices like Amazon’s Echo that function as a sort of Siri for home use.

In addition to detailing law enforcement’s investigative uses of such devices, he pointed out their potential for informational mischief, especially when they are asked to conduct a search. A Google search performed on a computer screen yields pages of results, highlighting inconsistent responses and the questionable credibility of some of those responses. A virtual assistant, by contrast, simply reads out whatever information someone skilled at search engine optimization has moved to the top of the response list.

His example: responding to the question “who won the popular vote,” one personal assistant read from a single conspiracy site reporting that Trump had actually won the popular vote. No list, no context, no description of the source.

If the implications of his presentation weren’t troubling enough, a report from the Medium website gave me chills.

A data scientist and others had begun digging into so-called “fake news” sites after the election.  It soon became clear to them that they were dealing with a phenomenon that encompassed much more than just a few fake news stories. It was a piece of a much bigger and darker puzzle — a Weaponized AI Propaganda Machine being used to manipulate public opinions and behaviors to advance specific political agendas.

By leveraging automated emotional manipulation alongside swarms of bots, Facebook dark posts, A/B testing, and fake news networks, a company called Cambridge Analytica has activated an invisible machine that preys on the personalities of individual voters to create large shifts in public opinion. Many of these technologies have been used individually to some effect before, but together they make up a nearly impenetrable voter manipulation machine that is quickly becoming the new deciding factor in elections around the world.

Most recently, Analytica helped elect U.S. President Donald Trump, secured a win for the Brexit Leave campaign, and led Ted Cruz’s 2016 campaign surge, shepherding him from the back of the GOP primary pack to the front.

The company is owned and controlled by conservative and alt-right interests that are also deeply entwined in the Trump administration. The Mercer family is both a major owner of Cambridge Analytica and one of Trump’s biggest donors. Steve Bannon, in addition to acting as Trump’s Chief Strategist and a member of the White House Security Council, is a Cambridge Analytica board member. Until recently, Analytica’s CTO was the acting CTO at the Republican National Convention.

Analytica has declined to work on any Democratic campaigns, and according to the story, is negotiating to help Trump both manage public opinion around his presidency and expand sales for the Trump Organization.

Cambridge Analytica is now expanding aggressively into U.S. commercial markets and is also meeting with right-wing parties and governments in Europe, Asia, and Latin America….

There’s been a wave of reporting on Cambridge Analytica itself and solid coverage of individual aspects of the machine — bots, fake news, microtargeting — but none so far (that we have seen) that portrays the intense collective power of these technologies or the frightening level of influence they’re likely to have on future elections.

No, They Don’t “All” Do It

Every parent has heard a child respond to a scolding with “Everybody does it.”

When it’s children trying to evade responsibility, we see through that excuse pretty easily. When adults engage in such evasions–when they resort to false-equivalency arguments–we seem to be more gullible.

That has been especially true in politics, where complaints about political polarization and generally toxic partisan behaviors are routinely accompanied by rueful statements to the effect that, while reprehensible, “both sides do it.”

They don’t. At least, not with respect to phony “facts.”

A recent major study by the Columbia Journalism Review

shows that political polarization is more common among conservatives than liberals — and that the exaggerations and falsehoods emanating from right-wing media outlets such as Breitbart News have infected mainstream discourse….

The CJR study, by scholars at the Berkman Klein Center for Internet & Society, at Harvard Law School, and the MIT Center for Civic Media, examined more than 1.25 million articles between April 1, 2015, and Election Day. What they found was that Hillary Clinton supporters shared stories from across a relatively broad political spectrum, including center-right sources such as The Wall Street Journal, mainstream news organizations like the Times and the Post, and partisan liberal sites like The Huffington Post and The Daily Beast.

By contrast, Donald Trump supporters clustered around Breitbart — headed until recently by Stephen Bannon, the hard-right nationalist now ensconced in the White House — and a few like-minded websites such as The Daily Caller, Alex Jones’ Infowars, and The Gateway Pundit. Even Fox News was dropped from the favored circle back when it was attacking Trump during the primaries, and only re-entered the fold once it had made its peace with the future president.

Right-wing sites, led by Breitbart, were able to push traditional media outlets into focusing on Trump’s issues and–even more importantly–to frame those issues as Trump did. More troubling still, right-wing sources were able to influence portrayals of Clinton and to keep the mainstream media focused on her supposed “scandals.”

As the study’s authors noted,

It is a mistake to dismiss these stories as “fake news”; their power stems from a potent mix of verifiable facts (the leaked Podesta emails), familiar repeated falsehoods, paranoid logic, and consistent political orientation within a mutually-reinforcing network of like-minded sites.

Use of disinformation by partisan media sources is neither new nor limited to the right wing, but the insulation of the partisan right-wing media from traditional journalistic media sources, and the vehemence of its attacks on journalism in common cause with a similarly outspoken president, is new and distinctive.

It turns out that the news appetites of liberals and moderates differ from those of the radical right-wing fringe that is today’s Republican base.

What’s at issue here is not just asymmetrical polarization but asymmetrical news consumption. The left and the center avail themselves of real journalism, however flawed it may be, while the right gorges on what is essentially political propaganda — all the while denigrating anything that contradicts their worldview as “fake news.”

It’s a winning business model: tell the paranoid what they want to hear, and assure them that everyone else is lying. That approach made Rush Limbaugh rich, then made Fox News highly profitable, and more recently, evolved into disinformation’s logical conclusion: Breitbart.

But “everyone” doesn’t consume this propaganda. The deficiencies in intellectual honesty on the left pale in comparison to the avid consumption of bullshit that characterizes the rabid right.

They aren’t equivalent.