I Want to Surrender to Cambridge Analytica

(Bloomberg View) -- The theory that a sinister big-data firm called Cambridge Analytica (and some associated companies) played a major role in the election of U.S. President Donald Trump and the Brexit vote is remarkably persistent despite some obvious flaws, which I and many others have pointed out. Carole Cadwalladr, whose series of alarmist articles about Cambridge Analytica has triggered a British investigation, is doing serious journalistic work, but her liberal use of terms such as "manipulating emotions" and "psychological warfare," backed by rather vague explanations, also helps feed a conspiracy theory. The reality is far less sinister.

Arguing about whether big data and internet-based psychological profiling can swing votes is exciting but rather pointless: Our personal data is being harvested and dissected by politicians and their contractors (as well as commercial marketers) regardless of whether it actually helps them win. Cambridge Analytica and its peers depend on the trail we leave on the internet. Our public posts and browsing histories reveal where we stand in the so-called OCEAN framework, which scores people on Openness to experience, Conscientiousness, Extraversion, Agreeableness and Neuroticism. That sounds like a service Big Brother might offer, but it's important to understand what the terms mean and what these firms can actually do.
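As a rough illustration of what those scores amount to, here is a toy sketch in Python. The trait cues and weights are invented for this example; real profilers fit statistical models to data from labeled questionnaires, but the output is the same kind of object: five numbers.

    # Toy illustration: an OCEAN profile is just five numbers.
    # The word lists and weights below are invented for this sketch;
    # real profilers train statistical models on questionnaire data.

    TRAIT_CUES = {
        "openness": ["curious", "art", "imagine"],
        "conscientiousness": ["plan", "deadline", "organize"],
        "extraversion": ["party", "friends", "talk"],
        "agreeableness": ["thanks", "help", "kind"],
        "neuroticism": ["worried", "stress", "afraid"],
    }

    def toy_ocean_profile(text):
        """Return crude 0-1 scores for the Big Five traits."""
        words = text.lower().split()
        profile = {}
        for trait, cues in TRAIT_CUES.items():
            hits = sum(words.count(cue) for cue in cues)
            # Squash raw counts into a 0-1 score; 0.5 means "no signal."
            profile[trait] = min(1.0, 0.5 + 0.1 * hits)
        return profile

    print(toy_ocean_profile("I imagine a curious art project with friends"))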

Anyone can find out what psychological profilers have on him or her by using a public service offered by IBM. I entered my Bloomberg View columns for one analysis and my Twitter feed (@bershidsky) for another. The results were dramatically different; for example, my columns suggest I'm far more extroverted and far less conscientious than my Twitter feed does. I suspect most people will find that the results of such analyses vary wildly depending on what data are fed into the machine, in keeping with the time-tested "Garbage In, Garbage Out" principle.
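That oscillation is easy to picture: run the same person's two corpora through the same scorer and compare the five numbers. The profiles below are hypothetical stand-ins for the two analyses described above; the values are invented.

    # Hypothetical profiles for the same author from two different corpora.
    # The numbers are invented; the point is that they need not agree.
    from_columns = {"openness": 0.9, "conscientiousness": 0.3,
                    "extraversion": 0.8, "agreeableness": 0.5,
                    "neuroticism": 0.4}
    from_tweets = {"openness": 0.7, "conscientiousness": 0.7,
                   "extraversion": 0.4, "agreeableness": 0.5,
                   "neuroticism": 0.6}

    for trait in from_columns:
        gap = abs(from_columns[trait] - from_tweets[trait])
        flag = "  <-- disagrees" if gap >= 0.3 else ""
        print(f"{trait:18s} columns={from_columns[trait]:.1f} "
              f"tweets={from_tweets[trait]:.1f}{flag}")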

It's hard to cheat a well-designed psychological questionnaire. Michal Kosinski, the pioneering researcher in the field of internet-based psychological profiling, collected his data through just such questionnaires. But the quality of analysis slips with data that hasn't been voluntarily submitted. How schizophrenic are you? How different are your online personalities from the "real you" of analog interactions? You may not know that yourself, and there's no way for the researchers to figure it out.

Nor can they really understand what your browsing history -- another oft-harvested dataset -- means. You may use the same search keywords and visit the same sites to do a school project or to seek help for drug addiction. Your child can use your browser to explore the world while you're away. You can behave one way online when drunk and another when sober. Your Chrome history may surprise you; mine sometimes amazes me.

Targeting political messages to emotional profiles constructed from our digital imprints is an extremely inexact science -- just like Facebook's ad targeting, which is better at freaking people out than at selling them things. That targeting, however, is what conspiracy theorists mean by "emotion manipulation."

Cambridge Analytica has claimed that it didn't use psychological profiling in the Trump campaign. We do know for certain, though, that the firm used some hard data on voters.

Zurich mathematician Paul-Olivier Dehaye, a privacy activist who investigated the Cambridge Analytica story, has advised a number of people to request their personal data from the firm. In March, David Carroll, a New York designer, posted bits of what he received in response. There were no browsing data and no analysis of social network posts, but there was a detailed voting record and a voting propensity model based on it. Carroll wrote that he would have scored himself differently, but Cambridge Analytica's assessment felt "roughly accurate." It concluded that Carroll was a likely voter in the 2016 presidential election but a "very unlikely Republican."
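A propensity model of that general kind can be sketched in a few lines: past turnout as features, a probability of voting in the next election as output. The records and feature set below are fabricated for illustration and say nothing about Cambridge Analytica's actual methods.

    # Generic sketch of a turnout-propensity model; the voter records and
    # the feature set are fabricated for illustration.
    from sklearn.linear_model import LogisticRegression

    # Features: voted in 2008, 2010, 2012, 2014 (1 = yes); label: voted in 2016.
    X = [
        [1, 1, 1, 1],
        [1, 0, 1, 0],
        [0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 1, 0],
        [1, 0, 0, 0],
    ]
    y = [1, 1, 0, 1, 0, 0]

    model = LogisticRegression().fit(X, y)

    # Score a voter who turned out in presidential years but skipped midterms.
    new_voter = [[1, 0, 1, 0]]
    print("P(votes in 2016) =", round(model.predict_proba(new_voter)[0][1], 2))

A party-propensity score like "very unlikely Republican" can be produced the same way, with party registration or primary participation as the label instead of turnout.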

It's questionable whether the firm should have had access to this information or been able to process it in the U.K., as it did. To privacy advocates, any use of data people haven't knowingly released is a no-no. But Cambridge Analytica couldn't have "manipulated" Carroll on the basis of the data it had on him. The hardest part of using big data is figuring out which parts of it are relevant and what they actually mean for the task at hand, like getting Trump elected or selling watches (I haven't bought one in years, but for some reason Facebook bombards me with watch ads).

Then comes the next problem: "psychological warfare," better known as propaganda, a set of techniques used, of course, by militaries in wartime but also, since time immemorial, by political campaigns and businesses.

Propaganda, and its fake-news subspecies, only works on those consciously susceptible to it -- those whose confirmation biases it strokes -- or, less reliably, on people not engaged enough to exercise skepticism about what they read. Pushing content to the former is preaching to the converted. The latter are unreliable because your competition targets them, too (Hillary Clinton's campaign, like Trump's, used microtargeting in the 2016 election, probably based on models similar to the one Carroll received from Cambridge Analytica).

With their current accuracy, the microtargeters are a Big Brother with Alzheimer's. That creates unsatisfactory user experiences: irrelevant, irritating ads and, in my case, a barrage of emails asking for money to help the Trump campaign. As the owners of these data, we have two choices: try to stop them from profiling us poorly or encourage them to do a better job.

For those of us who have little to hide, it may make sense to provide more data to the researchers and for them to be a little more open about what they do. If you are committed to this digital lifestyle of algorithms suggesting products to buy, TV shows to watch and places to go, you might as well get what you pay for. It's not hard to believe that your quality of life could improve if robots knew you well enough to bring back what you need from the unnavigable jungle that is the internet. My IBM-generated psychological profile says I'm extremely open and curious -- so bring on all the propaganda, liberal or conservative, and let me gorge myself. I would genuinely enjoy that.

I would also happily fill out feedback forms that would allow better tailoring of what I receive: options like "No more watches" or "a little bit more Trump."

Why treat people as sheep? Mass surveillance can be a cooperative enterprise.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Leonid Bershidsky is a Bloomberg View columnist. He was the founding editor of the Russian business daily Vedomosti and founded the opinion website Slon.ru.
