Facebook Grew Too Big to Care About Privacy

(Bloomberg Opinion) -- Two years ago, a Yale Law School student published what became an influential paper about how antitrust law should apply to one of America’s superstar technology companies, which don’t fit the conventional mold of Standard Oil monopolists. 

Now, another academic paper, this one from a former advertising technology executive and Yale law graduate, argues that Facebook Inc. abuses its power. In the paper, titled in part “The Antitrust Case Against Facebook,” author Dina Srinivasan offers a deeply researched analysis of Facebook’s pattern of backtracking on the user data collection that allowed the company to become a star. Once Facebook was powerful and popular, Srinivasan says, it was able to overrun objections to its data-harvesting practices.

The core of Srinivasan’s argument is to treat two anxieties about Facebook — potential abuses of monopoly power and violations of users’ privacy — not as separate but as two sides of the same coin. It’s a relatively novel idea that has echoes in a recent order from Germany’s antitrust authority. (Facebook has said the German regulator was wrong to link enforcement of privacy law and antitrust, and the company said it is appealing the decision.)

The paper was published this week in the Berkeley Business Law Journal at the University of California, Berkeley; I read a version that has been online for two weeks.

I’ll leave it to legal experts to assess the validity of Srinivasan’s antitrust analysis. I was drawn in by her selective, although not inaccurate, reading of Facebook’s history of increasing boldness in harvesting information for the purpose of tailoring ads.

In its early years, Srinivasan says, Facebook competed against once-popular social networks like MySpace in part by pitching itself as protective of people’s privacy. But once Facebook became an indispensable tool of digital life, it gained the power to reverse promises it had made not to gather certain types of information on people’s online activity.

Her piece is a reminder that Facebook’s practice of gathering digital dossiers for commercial purposes didn’t happen all at once but as a slow creep that overwhelmed — or waited out — objections that Facebook was misleading the public or undermining people’s privacy. This history remains relevant as Facebook works on linking its multiple internet hangouts in a way that is likely to generate yet more user data for itself.

In one particularly compelling example, Srinivasan focuses on Facebook’s “like” and “share” buttons. The company started to introduce the features in 2010, and now they’re on millions of websites. Many Facebook users don’t know it, but those bits of software code enable the company to collect information as people roam non-Facebook websites — whether or not people click them.
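How can a button collect data without being clicked? Here is a minimal, hypothetical sketch of the mechanics Srinivasan describes: when a publisher embeds a social plugin, the visitor’s browser requests code or an image from the platform’s servers the moment the page loads, and that request can carry the platform’s cookies along with the page’s address. The function name and endpoint below are illustrative assumptions, not Facebook’s actual implementation.

```typescript
// Hypothetical, simplified sketch of how an embedded "like" button can report
// a page visit. The endpoint and names are illustrative only; this is not
// Facebook's actual code.

function reportPageView(platformOrigin: string): void {
  // The embed script runs as soon as the publisher's page loads, before any click.
  const pageUrl = encodeURIComponent(window.location.href);

  // Requesting an image from the platform's servers makes the browser send along
  // any cookies it holds for that origin, letting the platform tie the visit to
  // a known user, whether or not the visitor ever clicks the button.
  const beacon = new Image();
  beacon.src = `${platformOrigin}/plugins/impression?url=${pageUrl}`;
}

// The publisher's embed snippet would call this while rendering the button.
reportPageView("https://social-platform.example");
```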

Initially, Facebook told partners and the public that it wasn’t tracking people’s web surfing, nor would it use that information for personalized advertising. The company’s initial statements about what information it was collecting and when were not entirely correct, but Srinivasan argues that Facebook was at least cautious about such practices. There were enough competitive social network alternatives that there was blowback each time the company overreached in the types of information it wanted to gather or how it would broadcast people’s actions online.

Then, in 2014, Facebook changed its policy to allow for use of web activity data in ad targeting. The company, Srinivasan writes, “would do precisely what it had spent seven years promising it did not and would not do, and finally accomplished what the previous competitive market had restrained it from doing.”

To Srinivasan, this was part of Facebook’s pattern of bait-and-switch tactics surrounding data harvesting. The company grew so popular, she says, that it could eventually change the rules.

I was struck by the common thread between Facebook’s recent data privacy scandals, Srinivasan’s reading of Facebook history and the conclusions from Germany’s antitrust authority. People wary of Facebook now face an unappealing choice between pervasive surveillance by a powerful company and pulling out of a bedrock tool of modern life. And it didn’t get that way by accident.

Bloomberg is a member of Digital Content Next, the trade association that conducted a survey on Facebook users' expectations.

To contact the editor responsible for this story: Daniel Niemi at dniemi1@bloomberg.net

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Shira Ovide is a Bloomberg Opinion columnist covering technology. She previously was a reporter for the Wall Street Journal.

©2019 Bloomberg L.P.