A demonstrator wears a mask depicting Facebook Inc. Chief Executive Officer Mark Zuckerberg, center, as he stands with demonstrators wearing angry emoji masks outside the venue of a U.K. parliamentary committee hearing in London, U.K. (Photographer: Simon Dawson/Bloomberg)

Should Twitter, Facebook, and Others Delete Provably Wrong Material?

(Bloomberg Businessweek) -- Social media platforms say it’s not their job to make sure that everything people say on their sites is accurate. “We definitely don’t want to be the arbiter of the truth,” Facebook Inc. Chief Operating Officer Sheryl Sandberg told the BBC in 2017. That position is understandable: Not only would the task of verifying every post be enormous, but censorship would annoy many of the sites’ most active users.

On the other hand, the false information that remains rife on the major sites can have horrifying consequences. In Myanmar, virulent posts by Buddhist nationalists on Facebook last year incited ethnic cleansing of the Rohingya minority. In 2016 a North Carolina man fired rifle shots into a Washington, D.C., pizza restaurant because he believed phony YouTube videos claiming the restaurant harbored a satanic child-sex ring involving Hillary Clinton. The list goes on.

So what’s the solution? A new report by the Center for Business and Human Rights at New York University’s Stern School of Business argues that the platforms should indeed remove material that is “provably untrue,” rather than just demote or annotate it as some do now. Anything that’s in the gray area between true and false could remain, it says. For example, the report says: “A story consistent with the headline ‘The Holocaust Never Happened’ is provably untrue and ought to be removed for that reason. By contrast, a story headlined ‘Democrats Secretly Favor Open Borders’ may be unsubstantiated and misleading, but it isn’t provably false.”

A crackdown on provably untrue material wouldn’t violate the First Amendment right to free speech, the report says, because the First Amendment applies only to government action. A private company such as Facebook, Twitter Inc., or YouTube parent Alphabet Inc. can pick and choose what it displays. “The algorithms they craft do sort billions of posts and tweets, inevitably making choices about what content users see. And at times, applying a combination of software and human judgment, the companies already exclude content or ban users altogether. Our position is that they ought to take the next step and act more vigorously to diminish domestically generated false content,” says the report, written by Paul Barrett, the center’s deputy director, who is a former writer and editor at Bloomberg Businessweek magazine.

Unfortunately, weeding out false material in a fair, impartial way is more easily said than done, as the National Review’s David French writes in a new column, “The Social Media Censorship Dumpster Fire.” One unanticipated problem is that censors employed by the likes of Facebook can become so traumatized by exposure to conspiracy theories that they start believing them. In “The Trauma Floor,” an article for The Verge, Casey Newton writes: “One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: ‘I no longer believe 9/11 was a terrorist attack.’ ”

Another problem is what to do when a world leader—say, President Donald Trump—posts something that is provably untrue. Twitter addressed this issue in a statement last year. “Blocking a world leader from Twitter or removing their controversial Tweets would hide important information people should be able to see and debate. It would also not silence that leader, but it would certainly hamper necessary discussion around their words and actions,” the company wrote. The center’s report essentially agrees, recommending that rather than removing such posts, “Twitter needs to consider actively curating the many opposing comments these tweets provoke.” Writes Barrett in an email: “No system is going to be perfect when you’re trying to define and then remove the hoaxes and conspiracy theories.”

To contact the editor responsible for this story: Eric Gelman at egelman3@bloomberg.net

©2019 Bloomberg L.P.