
Facebook’s Fake News Problem Is Way Bigger Than Hoaxes

Experts question whether newer misinformation tactics are being missed.

A woman checks Facebook on her smartphone in London, U.K. (Photographer: Chris Ratcliffe/Bloomberg)

(Bloomberg) -- In February, the YourNewsWire page on Facebook was at its peak popularity, boosted by its salacious post claiming that Justin Trudeau, the Canadian prime minister, was Fidel Castro’s love child.

That story, shared 1,800 times, represented exactly the kind of content Facebook Inc. had promised to clean up on its site -- and which YourNewsWire prolifically produced. It was created to drive attention, and therefore advertising revenue. It was also provably false.


YourNewsWire is still publishing. But its stories aren’t going viral on Facebook anymore, and the website is finding it more difficult to make money. In the case of YourNewsWire, at least, Facebook has delivered on its promises in time for the U.S. midterm elections on Tuesday.

“In many ways, they’re an ideal test example in at least the limited scope of what Facebook said they wanted to do -- to see blatantly false news debunked, and reduce its reach,” said Alexios Mantzarlis, who leads the International Fact-Checking Network at the Poynter Institute.

It’s been more than two years since the 2016 election conversation was muddied with viral false information, like the report that Donald Trump had been endorsed by the pope. While Facebook has admitted some responsibility for the spread of fake news, its road to fixing the problem has been slow. The company decided it would limit its efforts to stories that were provably false, and it wouldn’t do so directly. Instead, Facebook works with third-party fact-checkers, including PolitiFact and Snopes. In interviews, fact-checkers said they often have only enough staff to address a few stories a week, sometimes long after those stories have gone viral. When stories are debunked, Facebook reduces their reach.

This election is unlikely to see a story go viral at the level of the fake pope endorsement article. Absent more data, it’s hard to know how much of that progress to attribute to Facebook. Those who study the internet’s worst offenders say their stories aren’t resonating as much as they have in the past. It may be that readers are wiser, said Brooke Binkowski, managing editor at TruthOrFiction.com.

“Readers have become more savvy,” said Binkowski, who used to work at Snopes, the Facebook partner. “They understand that fake news is a problem, and they’ve become more vigilant.”

Facebook built a system that specifically addresses hoax news websites and pages. But that shifted some of the fake news activity to posts and images that go viral in Facebook groups, where old photos are often doctored or recaptioned to fit current news events. Facebook groups have helped spread misinformation about a caravan of immigrants traveling on foot to the U.S. border -- falsely labeling the group as violent or diseased, in a fear campaign that has bubbled up to President Trump.

“Groups have become the preferred base for coordinated information influence activities on the platform,” not the Facebook Pages that were active ahead of the 2016 election, said Jonathan Albright, director of the Digital Forensics Initiative at Columbia’s Tow Center for Digital Journalism, in a recent report. “It is Facebook’s Groups -- right here, right now -- that I feel represents the greatest short-term threat to election news and information integrity.”

If hoax publishers aren’t as much of a problem in the U.S., polarization still is. Publishers on the far right or far left -- who don’t publish fake news so much as real news presented in a skewed context, meant to alarm readers -- still thrive on Facebook. To address the issue, the social network has been asking users to rate publishers by trustworthiness and baking the scores into its algorithm. So far, that hasn’t stopped hyperpartisan news from flourishing.

A Facebook spokeswoman said the company is aware that photo and video misinformation has become more common, and this year started enabling fact-checkers to address it, too. Facebook also touted three recent studies from Stanford University, the University of Michigan and the French newspaper Le Monde that concluded the magnitude of misinformation has declined on the social network.

“It’s challenging to prove we’re making progress on this because of lack of consensus on what ‘false news’ means,” a Facebook spokeswoman said in a statement. “But we know that this is a highly adversarial space and we have more work to do.”

YourNewsWire, based in Southern California, built its reputation on conspiracy theories -- claiming that public figures are pedophiles or that vaccines kill -- and became one of the top broadcasters of blatant misinformation. Content with no basis in fact is harder to fully disprove, Mantzarlis said.

“You can’t go to every middle-aged woman in America and ask them if they’re a body double for Hillary Clinton,” he said.

But YourNewsWire looks like a regular news site. At the top of the page, its tagline is “News. Truth. Unfiltered.” In some ways, that has made its false content more dangerous than that of Alex Jones, the propaganda artist behind the boisterous InfoWars brand, according to Aaron Sharockman, the executive director of PolitiFact. Jones was banned from Facebook earlier this year after public outrage over his content, particularly his repeated claim that the Sandy Hook elementary school shooting didn’t occur.

Sharockman said that every time a PolitiFact fact-checker finds a YourNewsWire story to be erroneous, the reporter calls to inform the website. Since Facebook’s down-ranking rules took effect, YourNewsWire has started deleting the flagged posts in an effort to avoid the penalty.

A Facebook spokeswoman said the company is aware that some publishers see that tactic as a workaround, and that it is soliciting feedback from fact-checking partners to make its policies clearer. Deleting a post isn’t enough to eliminate a “strike” against the page, Facebook said.

Sean Adl-Tabatabai, YourNewsWire’s editor-in-chief, denied that the site uses that strategy to protect itself against fact-checking, but has been in touch with Facebook about fact-checkers he thinks are overeager to nitpick his stories. He said Facebook listens, but tells him to take up his complaints with the fact-checkers directly.

“I would say overall the idea that there are third-party fact-checkers deciding what people can and cannot see on Facebook is problematic,” Adl-Tabatabai said. “The fact-checkers have been given more and more leeway and what can I do? If they remove me from the public square and put me in the digital gulag, what can I do?”

Facebook said that while sites are able to appeal the conclusions of fact-checkers, the partners are all following Poynter’s fact-checking principles.

YourNewsWire hasn’t been removed from the platform. But some of its partners have distanced themselves. Revcontent, which used to serve ads on YourNewsWire’s site, began in August to pull promotions and withhold revenue from any stories that outside fact-checkers had debunked. On Oct. 19, after repeated violations, Revcontent cut off YourNewsWire’s advertising support entirely, making it the first site to face that consequence over misinformation, according to Charlie Terenzio, Revcontent’s brand manager. Google, which had served ads on the site in the past, has not for a few years, according to a person familiar with the matter.

On Saturday, the domain YourNewsWire.com began redirecting to newspunch.com. News Punch has a different tagline: “Where mainstream fears to tread.” Sinclair Treadway, who runs the site with Adl-Tabatabai, said the site had to rebrand after negative publicity and Facebook’s fact-checking program hurt its reputation and its ability to make money.

At Facebook’s Menlo Park, California, headquarters this week, U.S. election activity will be monitored from a War Room filled with dashboards that keep constant tabs on what’s going viral. Facebook said it’s in contact with secretaries of state and state election bureaus to combat any fake news on the site that could suppress voting.

Meanwhile, the company has expanded its fact-checking network to 14 other countries, where it works with local-language news organizations and fact-checkers. The program doesn’t extend to WhatsApp, the company’s popular encrypted messaging app, where there’s no visibility into which stories are going viral. The app was a major vehicle for disinformation in Brazil’s recent election. Viral content has also helped fuel lynchings in India and ethnic violence in Myanmar, problems Facebook has only started to address this year.

The episodes demonstrate that although Facebook may have tamped down on hoax news in the U.S., bigger problems remain, driven by the viral nature of shocking content.

“I am not convinced that Facebook’s fact-checking program has ever worked,” Binkowski said.

--With assistance from Benjamin Elgin.

To contact the reporter on this story: Sarah Frier in San Francisco at sfrier1@bloomberg.net

To contact the editors responsible for this story: Jillian Ward at jward56@bloomberg.net, Andrew Pollack, Molly Schuetz

©2018 Bloomberg L.P.