Facebook Assesses Country Risks for Decisions on Content Removal

Facebook Inc. said it has developed a strategy since 2018 to monitor and remove content that violates its policies, particularly in countries most at risk of offline violence.

The factors used for such evaluations include social tensions and civic participation, as well as how the use of its social media tools affects that country, it said, citing recent elections in Myanmar, Ethiopia, India and Mexico as examples. It also considers how the information may shed light on a current problem, such as crime, elections, violence, and Covid-19 transmission and vaccination rates, it added.

“This allows us to act quickly to remove content that violates our policies and take other protective measures,” according to a Facebook blog post Saturday by Miranda Sissons, director of human rights policy, and Nicole Isaac, international strategic response director. “We know that we face a number of challenges with this work and it is a complex and often adversarial space -- there is no one-size-fits-all solution.”

As rioters breached barricades and bludgeoned police with flagpoles before storming the U.S. Capitol on Jan. 6, some Facebook employees took to an internal discussion board to express shock and outrage. Many of the posts were imbued with a dawning sense that they and their employer -- whose platforms for weeks had spread content questioning the legitimacy of the election -- bore part of the blame. 

©2021 Bloomberg L.P.