Facebook Oversight Board Reverses Hate Speech, Nudity Takedowns

Facebook Inc.’s independent oversight board issued its first round of rulings Thursday, overturning four out of five of the company’s decisions to remove posts with controversial content.

“None of these cases had easy answers and deliberations revealed the enormous complexity of the issues involved,” the board said in a statement on its ruling.

Two of the cases concerned hate speech. In one case, Facebook removed a post that included widely shared images of a Syrian toddler who drowned attempting to reach Europe. The user, based in Myanmar, claimed there was something wrong with Muslims psychologically or with their mindset. The board said that while the post might be offensive, it didn’t reach the level of hate speech.

In the second case, a post used a Russian word, which Facebook considered a slur, to describe Azerbaijanis. The board upheld Facebook’s decision, confirming that the term is a “dehumanizing slur attacking national origin.”

Another case the board overturned concerned a post about breast cancer symptoms that was taken down for violating standards on nudity. The board said the removal was an example of a “lack of proper human oversight.”

Another post criticized the lack of a health strategy in France and included claims that a cure for COVID-19 exists. Facebook removed the post for violating its policy on misinformation and for its potential to cause harm during the pandemic. The board found that the user was opposing government policy and attempting to change it.

The final case involved a post that included a quote incorrectly attributed to Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany. Facebook removed the post for violating its policies on dangerous individuals and organizations, and because the user didn’t make clear that they shared the quote to condemn Goebbels. The panel determined that the quote didn’t support the Nazi Party’s ideology or the regime’s acts of hate and violence; comments on the post from friends supported the user’s claim that they sought to compare Donald Trump’s presidency to the Nazi regime. It also said Facebook needed to be clearer about its policy on support for dangerous individuals.

Beyond the individual case decisions, the board also made suggestions for ways Facebook could change its content moderation systems.

“Some of today’s recommendations include suggestions for major operational and product changes to our content moderation — for example allowing users to appeal content decisions made by AI to a human reviewer,” Facebook said in a blog post response. “We expect it to take longer than 30 days to fully analyze and scope these recommendations.”

Facebook created the board, which is made up of 20 journalists, politicians and judges from around the world, last year in response to criticism of how it handles content. The board began accepting cases in October and in December announced the first six incidents to review. It has received more than 150,000 cases, and since it can’t hear every appeal, the board is “prioritizing cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies.”

The rulings announced Thursday are binding for Facebook -- it has already restored some of the content -- and come ahead of one of the weightiest decisions the board will make, on Facebook’s decision to suspend President Trump’s account in the wake of the Jan. 6 riots at the U.S. Capitol.

Mark Zuckerberg, Facebook’s co-founder and chief executive officer, has voting control of the company, and he has called the board a check on his power. The company has been under increasing pressure from governments across the globe to fix issues related to hate speech and disinformation.

©2021 Bloomberg L.P.
