Data Protection: Moving Beyond Consent 

The law we draft can allow us to leverage the benefits of data while protecting us from the harms.

A Cambridge Analytica symbol is displayed on an Apple Inc. iPhone against a backdrop of the Facebook Inc. sign (Photographer: Luke MacGregor/Bloomberg)

This is the second article in a series on the future of data protection in India. Read the first here.

The fact that India does not yet have a privacy law is a great opportunity to build one from scratch. Consent need not be its cornerstone.

We are deep in the middle of the data age. Information is being collected from us in more ways than we can count. As we walk around, the phones in our pockets and the watches on our wrists collect information about where we are and where we have been. Dozens of sensors – gyros, accelerometers, photoplethysmographs – record how vigorously we are walking, how fast our hearts are beating and what sport we are engaged in, constantly uploading that information to the cloud, where it is parsed and processed by different services to generate intimate data about our personal well-being.

Digital assistants embedded in our devices pore over our calendars and contacts, correlating that information with our location and the prevailing traffic conditions to help us plan our daily activities and get to our meetings on time. Always-on smart speakers lie about in our houses, listening to everything that is said and improving their understanding of who we are and what we need – the better to help us.

Our social interactions have migrated almost entirely online to the point where we openly share information we’d otherwise have kept private on the largest public noticeboards ever created, trusting the discretion of platforms to keep us safe. Or not caring at all for the consequences.

Our government and corporations – institutions that we deal with daily, such as tax departments, banks and telecom companies – are incessantly hoovering up information about us: who we speak with, the transactions we undertake, the places we go and the information we file. Hospitals, clinics and even our friendly neighbourhood GP store our medical data online and in the cloud, recording our symptoms and the medicines we have been prescribed for retrieval at a later point in time. Never before in history have we bled information more freely and in so many directions.

At the same time, it is undeniable that many benefits flow to us as a result of this data explosion. Those of us who have embraced these cloud technologies find it hard to function without the convenience they offer. Data-driven decision-making has had an impact across a wide range of sectors, and much of that impact will be truly transformative. It will power flow-based lending, bringing millions of otherwise ineligible people into the banking system. It will inform our medical interventions, allowing us to respond more appropriately to epidemics and aiding the discovery of cures for illnesses. While it is possible to see a dark cloud around every silver lining that technology offers, it is hard to deny the benefits it can bring.

This is the context within which India, for the first time in its constitutional history, has to frame a comprehensive data protection legislation.

We are in the unique position of being one of the very few nations in the world that still has no privacy law.

While some might see this as a failure on the part of the legislature to enact essential legislation, the absence of a formal legal framework gives us unprecedented latitude to craft one anew, unencumbered by the baggage of the past. It allows us to legislate using concepts that other countries cannot easily deploy, constrained as they are by the path dependence of regulatory frameworks that do not account for the complexities of the modern data age.

If we are perspicacious, the law we draft will allow us to leverage the benefits of data while protecting us from the harms.

Building A New Data Protection Law

So what are the elements that should make up this law? What do we borrow from other countries that have dealt with privacy for decades, and what do we build ourselves?

Most privacy laws around the world are built on the foundation of consent. They operate on the presumption that no-one can collect personal data without first obtaining consent. This is why we have to agree to privacy policies before we sign up to a service – our acceptance of these terms serving as the consent that is required under law.

The trouble is that we rarely read the terms we sign up to, and even if we do, we are incapable of fully appreciating their implications. Once collected, our data will be used in ways we are unlikely to have anticipated when we signed up, or combined with data from other services in ways we could not have foreseen. In all these instances it is virtually impossible to appreciate in advance the privacy consequences of the collection and utilisation of our data. What purpose does our consent serve in this context – other than operating as a shield protecting the entities that collect our information from liability?

The reason we are in this anomalous situation is the information asymmetry inherent in the data economy. Between the data subject and the data collector, it is the latter who is more likely to understand how the data being collected will be processed and used, and who has the ability to control that use. Data subjects will always provide their consent based on a limited understanding of the facts. If we are to devise a framework that truly protects personal privacy, it stands to reason that the entity principally responsible for ensuring that privacy is protected is not the data subject but the data controller.

We will need to find a mechanism to hold data controllers liable for what they do with the data – particularly when that use results in harm to the data subject.

India’s new data protection law should be based on the principle of accountability.

It should eschew reliance on consent as the sole means of ensuring privacy protection, recognising that any consent provided is unlikely to be informed. Data subjects should have the right to hold data controllers liable for the harms caused to them by improper use of the data that has been collected, and it should not matter that the data controller previously obtained consent for such use. To be clear, I am not suggesting that we do away with the data subject’s right to determine, of his own free will, whether or not to avail of a service. Consent will always be necessary in order to sign up to a service, and that consent must be separately obtained under all circumstances. I am merely suggesting that the additional consent obtained at the time of signing up, in respect of the use of personal information, should not be allowed to serve as a shield against the data controller’s liability for harm done to the data subject.

I am mindful of the fact that harm is hard to detect. Today, data is often processed using algorithms with little to no human intervention. Where processing takes place using neural networks, it is not humanly possible to understand how the algorithms work, as they are designed to operate as self-learning black boxes, applying weightages to different parameters using methods inscrutable to external eyes. Even otherwise, algorithms tend to be complex, their purpose shrouded in mystery and often kept secret by the data controller to preserve its competitive advantage or to ensure that they cannot be gamed. In all these circumstances it is very difficult for the lay person to detect harm – at least until it is too late.

Accordingly, the new data protection regime needs to offer a mechanism by which harm can be detected early enough to head off significant adverse consequences.

To achieve this, I propose the introduction of a new participant into the data ecosystem – a third-party data auditor (whom I call the learned intermediary) whose function is to detect, as early as possible, the potential harms that algorithms could cause to data subjects. These learned intermediaries would be persons knowledgeable in the way modern applications and services collect and process data, adept at understanding why an application collects data and how it is likely to use it. Their function would be to evaluate these services, probing their design and functionality to identify privacy-destructive behaviour. Where the algorithms are designed as black boxes, they should try to detect bias at the margins before the consequences become too widespread.

These auditors already exist today in the form of public-spirited individuals who make it their mission to probe applications for vulnerabilities, publishing the results of their investigations on the web to forewarn users of the implications of signing up to these services. Advanced users rely on these reports to decide whether or not to use these applications and how to modify their online behaviour to safeguard themselves against harm. But since this sort of audit is unofficial and disorganised, the information has only limited circulation and its benefit is marginal.

I propose that India’s new law should legitimise and institutionalise this process, using these experts to reduce, as far as possible, the information asymmetry that exists in the data economy.

The Question Of Autonomy

Finally, there is the question of autonomy. At the core of the notion of privacy is the need to ensure that every person has control over his personal data and the ability to determine outcomes that affect his personal privacy. Most privacy laws promote autonomy by creating a threshold condition that requires the data controller to seek the consent of the data subject before collecting personal data from him. However, as we have seen above, this power exists in name alone, as the consent we provide is not meaningful and, at best, pays lip service to the notion of autonomy.

We neither understand the consequences of the consent we provide nor spend the time and effort to read through the terms and conditions on which that consent is based before we accept. To my mind, this sort of threshold consent is a feeble – largely psychological – assurance of autonomy in a world where data collection is ubiquitous.

If we assume that, despite our best efforts, our data is going to be collected from us and there is little we can do to prevent it, we will need to focus our attention on ensuring that the data subject has every opportunity to exercise his autonomy after the data has been collected. The first step in this direction is making the data subject aware of who has collected his data and what is being done with it. This task is best left to the learned intermediary, who has the technical knowledge required to carry out such an investigation. Once the data subject knows that his personal information has been collected and how it is being processed, he must be provided with the tools necessary to determine, at his sole discretion, what should be done with it. This should include the right to require the data controller to stop processing all or a part of the information.

At the same time, the data subject should have the absolute right to require the data controller to transfer his data from one service provider to another.

At present, data controllers assume that collection and processing give them a right over the data subject’s personal data, and on that basis make it difficult for a data subject to request and effect a transfer. Patients looking for a second opinion often have to fight an uphill battle with hospitals that resist the transfer of data from their systems to those of a competitor.

The new privacy law should place the data subject at the centre of the equation, giving him the absolute right to require the transfer of information from one service provider to another.

It’s Time To Move Beyond Consent

I would propose that these principles form the kernel of our new data protection law. I am mindful of the fact that these principles are not commonly found in privacy laws around the world and that we will come under pressure from our international trading partners, who would prefer that we conform to global standards of privacy regulation. To them, I would argue that time has, in fact, tested the consent model and found it wanting. Our modern data-intensive world has exposed the weakness of the model, and our continued dependence on consent as the gatekeeper of personal privacy is unconscionable. We have an opportunity to create a modern privacy law with the unique advantage of being able to do so without being constrained by the path dependence that comes from a pre-existing framework of data regulators, binding corporate rules and safe harbours. We would do well to seize that opportunity and draft a law that can be an example for other countries to follow.

This article was originally published on Pragati.

Rahul Matthan is a partner with Trilegal and heads its Technology Media and Telecommunications practice.

The views expressed here are those of the author and do not necessarily represent the views of BloombergQuint or its editorial team.