Is Apple Really Your Privacy Hero?
(Bloomberg Businessweek) -- Apple Inc. has positioned itself as the champion of privacy. Even as Facebook Inc. and Google track our moves around the internet for advertisers’ benefit, Apple has trumpeted its noble decision to avoid that business model. When Facebook became embroiled in a scandal over data leaked by an app developer, Apple Chief Executive Officer Tim Cook said his company would never find itself in such a situation. He framed Apple’s stance as a moral one. Privacy is a human right, he said. “We never move off of our values,” he told NPR in June.
The campaign is working, as evidenced by media reports depicting Apple as hero to Facebook’s villain. But that marketing coup masks an underlying problem: The world’s most valuable company—its market value crossed the $1 trillion mark on Aug. 2—has some of the same security problems as the other tech giants when it comes to apps. It has, in effect, abdicated responsibility for possible misuse of data, leaving it in the hands of the independent developers who create the products available in its App Store.
Bloomberg News recently reported that for years iPhone app developers have been allowed to store and sell data from users who allow access to their contact lists, which, in addition to phone numbers, may include other people’s photos and home addresses. According to some security experts, the Notes section—where people sometimes list Social Security numbers for their spouses or children or the entry codes for their apartment buildings—is particularly sensitive. In July, Apple added a rule to its contract with app makers banning the storage and sale of such data. It was done with little fanfare, probably because it won’t make much of a difference.
When developers get our information, and that of the acquaintances in our contacts list, it’s theirs to use and move around unseen by Apple. It can be sold to data brokers, shared with political campaigns, or posted on the internet. The new rule forbids that, but Apple does nothing to make it technically difficult for developers to harvest the information.
This is the kind of situation that landed Facebook CEO Mark Zuckerberg in 10 hours of congressional testimony in April. In that case, the maker of a personality quiz app gathered the profile information not only of Facebook users, but also of those users’ friends, then shared it with Cambridge Analytica, the consulting company that worked to help elect Donald Trump. As many as 87 million people were affected, even though only 270,000 used the quiz app. Senators interrogated Zuckerberg about why the company didn’t have a means of knowing where the data went. “Once it’s out of our system, it is a lot harder for us to have a full understanding of what’s happening,” he said.
Apple has the ingredients for a Cambridge Analytica-type blowup, but it has successfully convinced the public that it has its users’ best interests at heart with its existing, unenforceable policies. Indeed, Bloomberg’s report about the app makers’ data access elicited positive commentary from lawmakers and privacy advocates about the potential benefit of Apple’s rule—and little mention of the 10 years of minimal oversight that preceded it. The office of Democratic Senator Mark Warner of Virginia, one of Facebook’s loudest congressional critics, said Cook and his company “should be applauded—for this and for other user-empowering moves Apple has made that will give consumers better control over how their data is used.”
But Apple doesn’t have control.
The company’s main argument for why it’s a better steward of customers’ privacy is that it has no interest in collecting personal data across its browser or developer network. It simply doesn’t need to, because it doesn’t make its money off advertising. The public has wholeheartedly embraced this “hear no evil, see no evil” strategy, given widespread discomfort with the quiet surveillance of private online habits by all the other multibillion-dollar corporations.
Apple’s argument holds when it comes to tracking phone messages or the articles users read. Certain data are indeed safer from third parties when stored on a device. But when it comes to the app developer network, that’s like a parent—in this case, Apple—claiming the developer kids are well-supervised. They’re not. Once Apple reviews and approves independent apps, it can’t see how the data they collect is used.
Apple didn’t respond when asked whether it’s banned any apps while enforcing its new policy. “App Store rules have always been selectively enforced,” says Joseph Jerome, a policy counsel on the privacy and data project of the Center for Democracy & Technology, a Washington-based consumer advocacy group focused on tech policy. Apple could always find some developers to make an example of, he says. (In 2012, Cook lectured the CEO of Path, a popular social app at the time, for creating address book databases to help app users find their friends quickly but without getting the permission of those friends.) Apple could also threaten to conduct audits. But it can’t guarantee that independent developer apps use data responsibly. “To even see how developers use the data would be really, really tough,” Jerome says.
The iOS app developer network is much more robust and influential than Facebook’s. The programmers have produced everything customers have used on their iPhones in the past 10 years that Apple hasn’t made itself. Developers have earned $100 billion in revenue in that time, even after Apple took its average 30 percent cut. Fortunes have been built on the personal data of Apple customers.
That information is useful to gaming, money transfer, and chat apps—particularly for advertising to people who know each other but don’t yet have the same app. The data can have creepy uses, too. Facebook told lawmakers that users’ data are fed into its “People You May Know” feature, which shows people with whom they might want to be friends. According to the tech blog Gizmodo, a man who made a private arrangement with a couple to donate sperm was years later prompted to add the child as a friend on Facebook. He’d never seen his biological daughter, but he was still in touch with the couple, so Facebook may have connected him via contact information.
Apple has built in two direct consumer controls: one, when you agree to share your contact information with the developer; and the other, when you toggle the switch in your settings to deny that permission. But neither is as simple as it seems. The first gives developers access to everything you’ve stored about everyone you know, more than just their phone numbers, and without their permission. The second is deceptive. Turning off sharing only blocks the developer from continued access—it doesn’t delete data already collected.
Google’s Android phones have the same problem. On a consumer help page, the company says removing developers’ access to contacts doesn’t remove the information they already have. But Google hasn’t built its public profile on promises of being a superior steward of our data.
Cook received a letter from the House Energy and Commerce Committee in July asking questions about how Apple handles consumer data. The committee said it was reviewing business practices that may “impact the privacy expectations of Americans.” Partly in response to the Bloomberg report, it asked, “Could Apple control or limit the data collected by third-party apps available on the App Store?”
On Aug. 7, Apple responded with a multiple-page document that included this statement: “Apple does not and cannot monitor what developers do with customer data they have collected, or prevent the onward transfer of that data, nor do we have the ability to ensure a developer’s compliance with their own privacy policies or local law. The relationship between the app developer and the user is direct, and it is the developer’s obligation to collect and use data responsibly.”
If Apple wants to truly be an advocate for consumer privacy, it could take the lead in building a better system—one that lets its customers more directly control who has their data. Companies don’t go out of their way to give users deeper control over their contact lists because it’s not beneficial to the bottom line, says Jennifer King, director of consumer privacy at Stanford’s Center for Internet and Society. “Nobody has really reimagined the address book since we made them electronic in the ’90s,” she says. “It’s just a phone book, and there’s no way to lock down information or privilege certain types.”
Developers have access to dozens of different data points they can ingest whenever a user says yes. So the first step is obvious: Restrict them from getting any information from users’ lists beyond phone numbers and email addresses. The next step is redesigning the controls of the list to allow users to encrypt or decline to share certain contacts. The names in a contact list could be benign, or they could be revealing—a doctor’s patients, a dealmaker’s network, a journalist’s sources. “Any high-net-worth or high-power individual wouldn’t give over their most sensitive contacts to a stranger,” King says. “Why can’t we lock that down?”
These are only steps—not a complete solution. That would require Apple to know who we are and follow our data around the internet. For all of Facebook’s privacy problems, it was at least able to alert people who were potentially affected by the Cambridge Analytica leak. Apple has no such mechanism. If the company insists on not knowing what happens to our data in the name of privacy, it can at least help us ensure we don’t share more of it than necessary.
To contact the editor responsible for this story: Howard Chua-Eoan at firstname.lastname@example.org
©2018 Bloomberg L.P.