“Pope Francis shocks world, endorses Donald Trump for president!”
“WikiLeaks confirms Hillary sold weapons to ISIS!”
“Ireland now officially accepting Trump refugees from America!”
In the run-up to the 2016 presidential election, such patently false headlines spread like wildfire across social media, ignited by fake news sites or hyper-partisan blogs.
A few decades ago, such stories would have been shrugged off as satire or dismissed by discerning journalists. But with the gatekeeping apparatus of mainstream media crumbling, trust in government on the decline and social media platforms providing a vehicle for anything to go viral, research shows such stories were not only distributed, they sometimes received more clicks than stories in The New York Times.
“In 2016, you started to see the weaponization of social media platforms in ways that I would characterize as dangerous to democracy,” recalled CU’s Toby Hopp, assistant professor of advertising, public relations and media design, who has made a career out of studying what he calls “countermedia” (fake news). “There seemed to be no boundaries anymore in terms of the information being communicated and the responsibility to the truth. It was really concerning.”
Fast-forward to 2020, and the “fake news” phenomenon has become more glaring. Distant “troll factories” — businesses where paid writers churn out fake social media posts intentionally designed to sow discontent among U.S. voters — are thriving in Russia, Macedonia and elsewhere. Conspiracy theories like QAnon — which posits, among other things, that the U.S. government is filled with Satan-worshiping pedophiles — circulate widely, and potentially dangerous misinformation about COVID-19 abounds.
“We exist in an unprecedented moment of deviant information,” said Pat Ferrucci, associate professor of journalism.
To get at the roots of that trend, Ferrucci, Hopp and Chris Vargo, assistant professor of advertising, have spent several years trying to unravel who shares fake news, what makes people click on it and what we can do about it.
“We have found that certain types of people are disproportionately responsible for sharing false, misleading and hyper-partisan information on social media,” said Hopp. “If we can identify those types of users, maybe we can get a grasp on why people do this and design interventions to stem the tide.”
Few Users, Big Reach
The good news: “The reality is, most people do not share fake news,” said Hopp.
In a recent study published in the journal Human Communication Research, the team analyzed posts from 783 regular Facebook and Twitter users between Aug. 1, 2015, and June 6, 2017.
Seventy-one percent of Facebook users and 95% of Twitter users did not share fake news or posts from sites identified by watchdog groups as countermedia.
The bad news: 1,152 pieces of fake news were shared via Facebook, with a single user responsible for 171. On Twitter, users shared 128 pieces of fake news.
“We found that Facebook is the central conduit for the transfer of fake news,” said Hopp.
In the Facebook sample, those who had self-identified as extremely conservative accounted for more than a quarter of all fake news shared. About a third of fake news shared on Twitter was by ultra-conservatives.
Those who self-identified as extremely liberal also played a big role in the spread, accounting for 17.5% of shares on Facebook and 16.4% on Twitter.
“It’s not just Republicans or just Democrats, but rather, people who are — left or right — more ideologically extreme,” said Hopp.
Previous studies have shown that Facebook users 65 and older post seven times as many articles from fake news sites as those under 29 years old, and contrary to popular belief, those who are fairly media literate also spread fake news.
Interestingly, the CU team found those with high levels of trust in their fellow humans are significantly less likely to spread fake news.
“People with high levels of social trust are more likely to compile online social networks comprised of diverse individuals,” said Hopp, noting that the spread of fake news can be slowed when users question a post’s accuracy.
Fear and Anger Drive Clicks
In the study they published in March, Hopp and Vargo examined 2,500 posts crafted and paid for by the infamous Internet Research Agency (IRA), a troll farm in St. Petersburg, Russia, which flooded Facebook with fake content in the run-up to the 2016 election.
According to U.S. government documents, the IRA had been creating fake U.S. personas on social media, setting up fake pages and posts and using targeted advertising to “sow discord” among U.S. residents.
Users flipping through their feeds that fall faced a minefield of incendiary ads, pitting Blacks against police, Southern whites against immigrants, gun owners against Obama supporters and the LGBTQ community against the conservative right — all coming from the same source thousands of miles away.
“This wasn’t necessarily about electing one candidate or another,” said Vargo. “It was essentially a make-Americans-hate-each-other campaign.”
In terms of return on investment, the campaign was remarkably effective.
The IRA spent about $75,000 to garner 41 million impressions, reaching 4 million users and generating a 9.2% clickthrough rate — many times higher than that of a typical digital ad.
Ads using inflammatory words (such as “sissy,” “idiot,” “psychopath” and “terrorist”) or that were designed to frighten or anger people did the best.
“The takeaway here was that fear and anger appeals work really well in getting people to engage with content on social media,” said Vargo.
When Everything Is True, Nothing Is
Fake ads and outright falsehoods aside, Ferrucci stresses that the term “fake news” itself can be misleading.
“When people think of fake news, they think of news that is completely made up from whole cloth. But that is only the tip of the iceberg,” he said.
Countermedia encapsulates a broader array of content, he said: “We believe that the most potentially negative information is that which has a kernel of truth in it but is slanted in a way that is completely deceiving.”
In decades past, he argued, conspiracy theories and deviant information certainly existed in the public sphere, but journalists generally ignored them. On the other end of the spectrum, some things were unequivocally agreed upon as true and free from debate. Between those extremes lay what Ferrucci calls “the sphere of legitimate debate.”
“There is nothing true anymore and everything is subject to debate. That’s the problem,” he said.
Here We Go Again
On the eve of another election, with a global pandemic raging, the misinformation machine appears to be ratcheting up again.
Public health agencies have warned of a “massive infodemic” amid circulating rumors suggesting that injecting disinfectant or consuming a dietary supplement called colloidal silver can cure COVID-19, or that wearing a mask can somehow boost susceptibility to it.
According to news reports, troll farms in Russia, Macedonia and elsewhere have refined their tactics and are again using social media to try to influence U.S. elections.
Some platforms have taken notice.
This summer, Twitter began adding fact-checking labels to tweets, including some originating from President Donald Trump. It also suspended thousands of accounts associated with QAnon.
Facebook now removes coronavirus news deemed inaccurate and sends a warning to those who have liked or shared it.
Such steps are helpful, Hopp said.
In the end, the battle against fake news will require a united front, including government, industry, journalists and, of course, social media users, the researchers say.
Ferrucci believes reporters should stop giving precious column inches or airtime to conspiracy theories like QAnon and instead focus on the sphere of legitimate debate.
Vargo suggests users become leery of ads and posts scrolling across their feeds and look into where they came from — especially those that make their blood boil.
If you see something on social media that you know is false, the researchers agree, don’t be afraid to say so.
“We can disagree here and there about things,” said Hopp, “but when we as a society have fundamentally different views about what is true and what is not, democracy becomes very hard to maintain.”
Illustration by Doug Chayka