
Facebook Removes More ISIS Content by Actively Looking for It

Facebook said it was able to remove more content from ISIS and al-Qaeda by actively seeking it out.

Facebook Inc. logo is reflected in the eyeglasses of a user in this arranged photo in San Francisco, California, U.S. (Photographer: David Paul Morris/Bloomberg)

(Bloomberg) -- Facebook Inc. said it was able to remove more content from the Islamic State and al-Qaeda in the first quarter of 2018 by actively looking for it.

The company has trained its review systems -- both humans and computer algorithms -- to seek out posts from terrorist groups. The social network took action on 1.9 million pieces of content from those groups in the first three months of the year, about twice as many as in the previous quarter. And 99 percent of that content wasn’t reported first by users but was flagged by the company’s internal systems, Facebook said Monday.

Facebook, like Twitter Inc. and Google’s YouTube, has historically put the onus on its users to flag content for its moderators to review. After pressure from governments to acknowledge its immense power over the spread of terrorist propaganda, Facebook began taking more direct responsibility about a year ago. Chief Executive Officer Mark Zuckerberg told Congress earlier this month that Facebook now believes it has a responsibility for the content on its site.

The company defines terrorists as non-governmental organizations that engage in premeditated acts of violence against people or property to intimidate others and to achieve a political, religious or ideological aim. That definition includes religious extremists, white supremacists and militant environmental groups. “It’s about whether they use violence to pursue those goals,” the company said.

The policy doesn’t apply to governments, Facebook said, because “nation-states may legitimately use violence under certain circumstances.”

Facebook didn’t give any numbers for takedowns of content from white supremacists or other groups it considers linked to terrorism, in part because its systems’ training has so far focused on the Islamic State and al-Qaeda.

Facebook has come under fire for being too passive about extremist content, especially in countries such as Myanmar and Sri Lanka, where the company’s algorithm, by boosting whatever posts are popular, has helped give rise to conspiracy theories that spark ethnic violence. People in those countries told the New York Times that even after they reported content, Facebook didn’t always take it down.


©2018 Bloomberg L.P.