
Here's How Facebook and Google Dodge EU Data Rules

The tech companies have found ways to discourage users from signing on to new privacy protections.

Privacy setting shortcuts are displayed on an Apple Inc. iPhone 6 smartphone screen as a Facebook Inc. logo is seen in this arranged photograph taken in London, U.K. (Photographer: Chris Ratcliffe/Bloomberg)

(Bloomberg Opinion) -- It’s becoming standard practice for U.S. tech giants to follow the letter of European rulings and regulations without really changing their behavior. Most recently, Facebook and Google have shown only superficial compliance with the EU’s General Data Protection Regulation, which requires companies to give users control over their personal data.

The government-funded Norwegian Consumer Council issued a report showing that the tech companies rely on “dark patterns” to discourage users from exercising their privacy rights. The term refers to interface designs intended to trick users into doing something, usually subscribing to a service they don’t want or giving up data. Facebook and Google have used this strategy for some time, even as they superficially adhered to the European rules known as GDPR. The report by Norway’s consumer-protection agency details the tricks the companies use to create the illusion of compliance.

The researchers found that Facebook and Google use default settings designed to extract as much personal data as possible from users. Their GDPR-related notifications are adorned with a big, convenient button for consumers to accept the company’s current practices. If the user declines, he or she is invited to change the settings. In effect, the system makes opting in the default response, while opting out is a multi-step process designed to dissuade users.

“From an ethical point of view, we think that service providers should let users choose how personal data is used to serve tailored ads or experiences,” the report says. “Defaulting to the least privacy friendly option is therefore unethical in our opinion, regardless of what the service provider considers legitimate interest.”

The consumer watchdog says Microsoft has done a better job of making it possible for consumers to protect their data. Windows offers a series of screens that invite users to make an explicit in-or-out choice. Its Windows 10 update required about the same number of steps for opting in as for opting out. The Norwegian researchers hold up that design as an example of transparency.

Facebook and Google don’t just default to the most privacy-intrusive settings. They also make changing them unattractive and cumbersome. The dissuasion begins with the color of the buttons: Facebook’s “Accept and continue” option is a reassuring blue, while “Manage data settings” is gray and appears less desirable. The wording also signals that a string of hard-to-understand, time-consuming choices lies ahead for those who select “Manage data settings.” The subliminal warning is apt: It takes 13 clicks to opt out of authorizing data collection; opting in can be done with a single click. Google’s approach is similar. And in both cases, it is even harder to delete data that has already been collected; erasing location history, for example, requires clicking through 30 to 40 pages.

“By giving users an overwhelming amount of granular choices to micromanage, Google has designed a privacy dashboard that, according to our analysis, actually discourages users from changing or taking control of the settings or delete bulks of data,” the researchers wrote.

The report also criticized the language Facebook and Google use to push consumers to accept data collection. Here’s Facebook’s pitch for its intrusive face-recognition feature: “If you keep face recognition turned off, we won’t be able to use this technology if a stranger uses your photo to impersonate you.”

But the notice doesn’t mention the limitations that “are in place on how Facebook may use this information,” the researchers wrote. “For example, the use of face recognition could be used for targeted advertising based on emotional states, or to identify users in situations where they would prefer to remain anonymous.”

Google, too, doesn’t warn about the downsides of ad personalization (such as the potential for being targeted with unbalanced political messaging); it simply tells users they will still see ads, just “less useful” ones.

When it came to wording, Microsoft, too, was found lacking. Its interface describes the less privacy-friendly options in glowing terms, such as “Improve inking and typing recognition,” and frames them as positive decisions, while the more restrictive choices are presented as negative.

The tech giants don’t limit themselves to wording and design nudges; they reward behavior that suits their goals and threaten punishment for users who want to take a different path. Take Facebook’s “see your options” button, which appears for those who don’t want to hit “I accept.” It brings up a screen with two options: going back to accept the terms or deleting the account. Google tells users who opt out of ad personalization that they won’t be able to “block or mute some ads.”

Microsoft, however, tells users that Windows devices will operate normally and be equally secure regardless of their privacy choices.

The efforts to obfuscate are best illustrated by the report’s flowcharts tracking the user’s path from the GDPR popup. Here’s Google’s:

[Flowchart from the report tracing the user’s path through Google’s GDPR popup.]

Interface design and wording tricks may seem like innocent manipulations that don’t detract from the tech companies’ overall compliance with European privacy rules. But the GDPR requires “privacy by design.” What users get instead is intrusion and data harvesting by design, with a limited amount of privacy available only to those willing to put in the time and overcome the confusion.

Privacy advocates such as the Austrian lawyer Max Schrems have filed complaints with national regulators about GDPR compliance. There undoubtedly will be more objections, which will eventually lead to lawsuits. But that approach creates the illusion of recourse while forcing people to spend time and money fighting for rights that are granted them by EU law. 

A more honorable solution was available, as Microsoft’s example shows. Of course, the software maker’s business model, unlike Facebook’s and Google’s, isn’t based on data collection. That, however, doesn’t mean the others should have a license to pretend that they follow the rules without actually doing so. European regulators should come down hard on their practices.

To contact the editor responsible for this story: Max Berley at mberley@bloomberg.net

©2018 Bloomberg L.P.