
Cass Sunstein on False Rumors, Social Media and Elusive Solutions

(Bloomberg View) -- Did you hear? Taylor Swift is doing a new album, consisting of her favorite Katy Perry songs — and despite their lengthy feud, Perry herself will be performing on the album!

OK, that’s not true. But a new study finds that by every measure, false rumors are more likely to spread than true ones. For those who believe in the marketplace of ideas and democratic self-government, that’s a big problem, raising an obvious question: What, if anything, are we going to do about it?

The study, conducted by Soroush Vosoughi, Deb Roy and Sinan Aral of the Massachusetts Institute of Technology, was based on a massive data set, consisting of all fact-checked rumor “cascades” that spread on Twitter from 2006 to 2017. All in all, there were about 126,000 such cascades, spread by about three million people more than 4.5 million times.

To test whether truth was stronger than falsehood, the researchers looked at rumors that had been fact-checked by six independent organizations (Snopes.com, PolitiFact.com, FactCheck.org, TruthOrFiction.com, Hoax-Slayer.com and UrbanLegendsOnline.com). The organizations reached the same conclusion about these rumors at least 95 percent of the time. The central questions: Did falsehoods get retweeted more often? Were they more likely to go viral?

The answers were clear: Yes and yes. Using careful statistical tests, Vosoughi and his co-authors find that “falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information.”

For example, falsehoods reached 1,500 people six times more rapidly than truth. And while false statements about business, science and entertainment did better than true ones, the biggest difference was in the domain of politics.
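To make the "six times more rapidly" claim concrete, here is a minimal sketch, in Python with invented numbers, of how one might measure the time a cascade takes to reach a given audience size. It illustrates the kind of measurement involved, not the authors' actual code or data.

```python
# Illustrative only: retweet times (in minutes) are invented, and this
# function is a simplified stand-in for the study's cascade analysis.
def minutes_to_reach(retweet_times: list[float], n: int) -> float | None:
    """Return the time at which the cascade gained its n-th retweet, if ever."""
    ordered = sorted(retweet_times)
    return ordered[n - 1] if len(ordered) >= n else None

false_cascade = [2, 5, 9, 15, 22, 31]          # hypothetical false-rumor cascade
true_cascade = [10, 45, 120, 300, 700, 1400]   # hypothetical true-rumor cascade

print(minutes_to_reach(false_cascade, 5))  # 22: reaches five people quickly
print(minutes_to_reach(true_cascade, 5))   # 700: far more slowly
```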

Importantly, the researchers found that falsehoods do not spread only or mostly because of the actions of “bots.” Vosoughi and his co-authors reran their analysis after using a bot-detection algorithm to identify and remove all bots, and they found that all of their main conclusions held. Human beings, it seems, are far more likely to spread falsehood than truth.
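Here is a rough sketch of that robustness check, assuming a tidy table of cascades with a bot flag already attached. The file and column names ("rumor_cascades.csv", "veracity", "cascade_size", "is_bot_account") are invented for illustration, not the study's actual schema.

```python
# Sketch: compare how far false vs. true cascades spread, then repeat after
# dropping tweets from accounts a bot detector has flagged.
import pandas as pd

def mean_cascade_size(tweets: pd.DataFrame) -> pd.Series:
    """Average cascade size for true vs. false rumors."""
    return tweets.groupby("veracity")["cascade_size"].mean()

tweets = pd.read_csv("rumor_cascades.csv")  # hypothetical data file

with_bots = mean_cascade_size(tweets)
humans_only = mean_cascade_size(tweets[~tweets["is_bot_account"]])

# If the gap between false and true persists in `humans_only`,
# bots alone cannot explain the spread of falsehood.
print(with_bots, humans_only, sep="\n")
```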

The researchers speculate that one reason may be novelty. Enlisting a variety of metrics to test whether tweets convey new information, they find that “false rumors were measurably more novel than true rumors.” It’s reasonable to hypothesize that novel information is more likely to spread, and that hypothesis may help to explain the comparative popularity of falsehoods.
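The paper enlists several novelty metrics. As a loose illustration of the idea, one could score a rumor by how dissimilar it is from the tweets a user recently saw, for instance with TF-IDF cosine distance. The sketch below takes that approach; it is my own simplification, not the authors' metrics, and the example tweets are invented.

```python
# One plausible way to quantify "novelty": how dissimilar a rumor tweet is
# from tweets the user was recently exposed to (TF-IDF cosine distance).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def novelty(rumor_text: str, prior_tweets: list[str]) -> float:
    """Return 1 minus the max cosine similarity to any recently seen tweet."""
    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform(prior_tweets + [rumor_text])
    sims = cosine_similarity(matrix[-1], matrix[:-1])
    return 1.0 - sims.max()

seen = ["new album out friday", "tour dates announced", "charity single drops"]
print(novelty("secret duet with a longtime rival leaks online", seen))
```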

Psychologists have also found that rumors are more likely to spread if they produce identifiable emotions, such as disgust. Vosoughi and his co-authors compared the emotional content of replies to true and false rumors. They found that truth produced greater sadness, trust and anticipation — while falsehoods produced greater surprise and disgust.
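Analyses of that sort typically apply an emotion lexicon to the text of replies. The toy version below uses tiny, invented word lists just to show the mechanics; the lexicon used in the study was far larger.

```python
# Toy illustration of lexicon-based emotion scoring of replies. The word
# lists are invented for illustration, not taken from the study.
from collections import Counter

EMOTION_LEXICON = {
    "surprise": {"unbelievable", "shocking", "whoa", "wow"},
    "disgust": {"gross", "disgusting", "vile", "awful"},
    "trust": {"reliable", "credible", "confirmed", "verified"},
    "sadness": {"sad", "tragic", "heartbreaking", "sorry"},
}

def emotion_profile(replies: list[str]) -> Counter:
    """Count how many reply words fall into each emotion category."""
    counts = Counter()
    for reply in replies:
        for word in reply.lower().split():
            for emotion, words in EMOTION_LEXICON.items():
                if word in words:
                    counts[emotion] += 1
    return counts

print(emotion_profile(["whoa, that is shocking", "confirmed and credible"]))
```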

These are striking and important findings, but it’s possible to raise some questions. Vosoughi and his colleagues do not really show that falsehoods are more likely to spread than truth. More precisely, they find that within the category of popular rumors tested by independent fact-finding bodies, the false ones are especially likely to spread.

That’s an important distinction, because plenty of falsehoods don’t spread. If I tweeted that the Michigan Law Review is now publishing its 112th volume, that the population of Germany is 85 million, that Carl Yastrzemski won baseball’s Triple Crown in 1969, or that Section 553 of the Administrative Procedure Act governs adjudication, people wouldn’t be all that interested, even though every one of these statements is false.

At the same time, lots of true statements get tons of attention. Consider those involving Robert Mueller’s investigation, or Stormy Daniels’s lawsuit against Donald Trump, or the latest statements and actions of North Korean leader Kim Jong Un.

In short, independent fact-checkers investigate only a very small subset of both false and true statements. Of that subset, the false ones may be especially provocative and interesting, above all in the political domain.

Anticipating this objection, Vosoughi and his co-authors also had their students study a sample of rumor cascades that had not been verified by fact-finding organizations. Their central conclusion held: Rumors found to be false spread more quickly than rumors found to be true. But that study could not possibly explore the full universe of true and false rumors, including the many false rumors that are deadly dull and get little or no attention.

Even with these qualifications, the new research is highly significant, because it shows that demonstrably false rumors receive a great deal of attention on social media. But what’s the best response?

Vosoughi and his co-authors conclude that “misinformation-containment policies” should include “behavioral interventions, like labeling and incentives to dissuade the spread of misinformation.” That might be right, but it’s pretty vague. Suppose that those who run Twitter, Facebook and other social media platforms were determined to reduce the spread of demonstrably false statements, at least when those statements are highly likely to cause serious harm.

It would be useful, if only as a thought experiment, to specify possible responses. Social media platforms could rely on the marketplace of ideas — and do nothing. They could get pretty aggressive — and immediately delete the false statements. They could offer corrections, red flags or vivid warnings. One or another approach would make sense in imaginable contexts. In the coming years, the question deserves sustained attention, with particular focus on what current platforms are, and aren’t, doing right now.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Cass R. Sunstein is a Bloomberg View columnist. He is the editor of “Can It Happen Here? Authoritarianism in America” and a co-author of “Nudge: Improving Decisions About Health, Wealth and Happiness.”

To contact the author of this story: Cass R. Sunstein at csunstein1@bloomberg.net.

To contact the editor responsible for this story: Katy Roberts at kroberts29@bloomberg.net.

For more columns from Bloomberg View, visit http://www.bloomberg.com/view.

©2018 Bloomberg L.P.