How Facebook’s Political Unit Enables the Dark Art of Digital Propaganda

After allegations that Facebook is being used for political propaganda, here’s what the co-founder has to say.

Mark Zuckerberg, chief executive officer and founder of Facebook Inc., speaks during an event (Photographer: David Paul Morris/Bloomberg) 

(Bloomberg) -- Under fire for Facebook Inc.’s role as a platform for political propaganda, co-founder Mark Zuckerberg has punched back, saying his mission is above partisanship. “We hope to give all people a voice and create a platform for all ideas,” Zuckerberg wrote in September after President Donald Trump accused Facebook of bias.

Zuckerberg’s social network is a politically agnostic tool for its more than 2 billion users, he has said. But Facebook, it turns out, is no bystander in global politics. What he hasn’t said is that his company actively works with political parties and leaders, including those who use the platform to stifle opposition, sometimes with the aid of “troll armies” that spread misinformation and extremist ideologies.

The initiative is run by a little-known Facebook global government and politics team that’s neutral in that it works with nearly anyone seeking or securing power. The unit is led from Washington by Katie Harbath, a former Republican digital strategist who worked on former New York Mayor Rudy Giuliani’s 2008 presidential campaign. Since Facebook hired Harbath three years later, her team has traveled the globe helping political clients use the company’s powerful digital tools.

In some of the world’s biggest democracies—from India and Brazil to Germany and the U.K.—the unit’s employees have become de facto campaign workers. And once a candidate is elected, the company in some instances goes on to train government employees or provide technical assistance for live streams at official state events.

Even before Facebook was forced to explain its role in U.S. election meddling—portrayed by its executives as a largely passive affair involving Russian-funded ads—the company’s direct and growing role catering to political campaigns raised concerns inside the social media giant.

“It’s not Facebook’s job, in my opinion, to be so close to any election campaign,” said Elizabeth Linder, who started and ran the Facebook politics unit’s Europe, Middle East and Africa efforts until 2016. Linder had originally been excited about the company’s potential to be “extraordinarily useful for the world’s leaders—but also the global citizenry.” She said she decided to leave the company in part because she grew uncomfortable with what she saw as increased emphasis on electioneering and campaigns.

In the U.S., the unit embedded employees in Trump’s campaign. (Hillary Clinton’s camp declined a similar offer.) In India, the company helped develop the online presence of Prime Minister Narendra Modi, who now has more Facebook followers than any other world leader. In the Philippines, it trained the campaign of Rodrigo Duterte, known for encouraging extrajudicial killings, in how to most effectively use the platform. And in Germany it helped the anti-immigrant Alternative for Germany party (AfD) win its first Bundestag seats, according to campaign staff.

By all accounts, Facebook has been an indispensable tool of civic engagement, with candidates and elected officials from mayor to prime minister using the platform to communicate directly with their constituents, and with grassroots groups like Black Lives Matter relying on it to organize. The company says it offers the same tools and services to all candidates and governments regardless of political affiliation, and even to civil society groups that may have a lesser voice. Facebook says it provides advice on how best to use its tools, not strategic advice about what to say.

“We’re proud to work with the thousands of elected officials around the world who use Facebook as a way to communicate directly with their constituents, interact with voters, and hear about the issues important in their community,” Harbath said in an emailed statement. 

She said the company is investing in artificial intelligence and other ways to better police hate speech and threats. “We take our responsibility to prevent abuse of our platform extremely seriously,” Harbath said. “We know there are ways we can do better, and are constantly working to improve.”

Power and social media converge by design at Facebook. The company has long worked to crush its smaller rival, Twitter, in a race to be the platform of choice for the world’s so-called influencers, whether politicians, cricket stars or Kardashians. Their posts will, in theory, draw followers to Facebook more frequently, resulting in higher traffic for advertisers and better data about what attracts users.

Politicians running for office can be lucrative ad buyers. For those who spend enough, Facebook offers customized services to help them build effective campaigns, the same way it would Unilever NV or Coca-Cola Co. ahead of a product launch.

While Facebook declined to give the size of its politics unit, one executive said it can expand to include hundreds during the peak of an election, drawing in people from the company’s legal, information security and policy teams.

At meetings with political campaigns, members of Harbath’s team sit alongside Facebook advertising sales staff who help monetize the often viral attention stirred up by elections and politics. They train politicians and leaders how to set up a campaign page and get it authenticated with a blue verification check mark, how to best use video to engage viewers and how to target ads to critical voting blocs.

Once those candidates are elected, their relationship with Facebook can help extend the company’s reach into government in meaningful ways, such as being well positioned to push against regulations.

At the very least, the optics of directly aiding campaigns or those in power may create the impression among users that Facebook is taking sides. Its work helping the Scottish National Party to victory in 2015 is recounted as a “success story” on Facebook’s corporate website, which lists business case studies, even though those who favor staying in the U.K. might see it otherwise. In April, Vietnamese officials bragged that Facebook would build a dedicated channel to prioritize takedown requests for content that offended authorities. The company generally routes requests from governments through a separate channel and takes content down if it violates community standards. If content violates only local law, it is made unavailable solely in the relevant country.

“They’re too cozy with power,” said Mark Crispin Miller, a media and culture professor at New York University.

That problem is exacerbated when Facebook’s engine of democracy is deployed in an undemocratic fashion. A November report by Freedom House, a U.S.-based nonprofit that advocates for political and human rights, found that a growing number of countries are “manipulating social media to undermine democracy.” One aspect of that involves “patriotic trolling,” or the use of government-backed harassment and propaganda meant to control the narrative, silence dissidents and consolidate power.

Internally, Facebook executives are grappling with how to distinguish between what constitutes trolling harassment and protected political speech. Zuckerberg has long maintained the company doesn’t want to play censor, but Facebook has drawn some lines—banning Greece’s Golden Dawn, the ultranationalist party, for example. The company also often removes the most extreme content, from white nationalists in the U.S. and from the Islamic State, as well as content it catches violating its “community standards” on hate speech and violence. Not all such content gets caught.

In retrospect, the nexus of power and data at Facebook seems inevitable. In 2007, Facebook opened its first office in Washington. The presidential election the following year saw the rise of the world’s first “Facebook President” in Barack Obama, who with the platform’s help was able to reach millions of voters in the weeks before the election. The number of Facebook users surged around the Arab Spring uprisings in the Middle East around 2010 and 2011, demonstrating the broad power of the platform to influence democracy.

By the time Facebook named Harbath, the former Giuliani aide, to lead its global politics and government unit, elections were becoming major social-media attractions. They now rank alongside the Super Bowl and the Olympics in terms of events that draw blockbuster ad dollars and boost engagement.

Facebook began getting involved in electoral hotspots around the world. Its staff went to Argentina in 2015, where now-President Mauricio Macri streamed campaign rallies live on Facebook and, once elected, announced his entire cabinet on the site, complete with emojis. The same year, Poland’s nationalist president, Andrzej Duda, became one of the first world leaders to livestream his inauguration on the social network. Even as Duda has overseen a crackdown on press freedom in the country, Facebook’s corporate website says the company was “integral” to his electoral success and that his page is “one of his office’s main communication channels.”

Facebook has embedded itself in some of the globe’s most controversial political movements while resisting transparency. Since 2011, it has asked the U.S. Federal Election Commission for blanket exemptions from political advertising disclosure rules that could have helped it avoid the current crisis over Russian ad spending ahead of the 2016 election, Bloomberg reported in October. After a Congressional inquiry into Russian election meddling, Facebook has pledged to be more transparent about ad buyers and said it’s open to regulation.

The company’s relationship with governments remains complicated. Facebook has come under fire in the European Union, including for the spread of Islamic extremism on its network. The company just issued its annual transparency report explaining that it will only provide user data to governments if that request is legally sufficient, and will push back in court if it’s not. Despite Facebook’s desire to eventually operate in China and Zuckerberg’s flirtation with the country’s leaders, it’s still unwilling to compromise as much as the government wants it to in order to enter.

India is arguably Facebook’s most important market, with the nation recently edging out the U.S. as the company’s biggest. The number of users there is growing twice as fast as in the U.S. And that doesn’t even count the 200 million people who use the company’s WhatsApp messaging service in India, more than anywhere else on the globe.

By the time of India’s 2014 elections, Facebook had for months been working with several campaigns. Modi, who belongs to the nationalist Bharatiya Janata Party, relied heavily on Facebook and WhatsApp to recruit volunteers who in turn spread his message on social media. Since his election, Modi’s Facebook followers have risen to 43 million, almost twice Trump’s count.

Within weeks of Modi’s election, Zuckerberg and Chief Operating Officer Sheryl Sandberg both visited the nation as it was rolling out a critical free internet service that the government later curbed. Harbath and her team have also traveled there, offering a series of workshops and sessions that have trained more than 6,000 government officials.

As Modi’s social media reach grew, his followers increasingly turned to Facebook and WhatsApp to target harassment campaigns against his political rivals. India has become a hotbed for fake news; one hoax story that circulated on WhatsApp this year led to two separate mob beatings that left seven people dead. The nation has also become an increasingly dangerous place for opposition parties and reporters. In the past year, several journalists critical of the ruling party have been killed. Hindu extremists who back Modi’s party have used social media to issue death threats against Muslims or critics of the government.

On the night of Sept. 5, a Honda motorcycle pulled in front of the Bengaluru home of Gauri Lankesh, an outspoken critic of Modi who had been targeted by patriotic trolls on Facebook and other social media. As the Indian journalist was unlocking her gate, three bullets struck her in the head and chest, killing her. No arrests have been made.

The final editorial Lankesh had written for her newspaper was titled “In the Age of False News.” In it, she lamented how misinformation and propaganda on social media were poisoning the political environment.

--With assistance from Benjamin Elgin

To contact the authors of this story: Lauren Etter in Austin at letter1@bloomberg.net, Vernon Silver in Rome at vtsilver@bloomberg.net, Sarah Frier in San Francisco at sfrier1@bloomberg.net.

To contact the editors responsible for this story: Flynn McRoberts at fmcroberts1@bloomberg.net, Robert Friedman, Robert Blau.

©2017 Bloomberg L.P.