Fake News: Misguided Policymaking To Counter Misinformation

Why the government’s draft internet intermediary rules fail to address fake news and instead impact other fundamental freedoms.

Social media application icons are displayed on an Apple iPhone. (Photographer: Brent Lewin/Bloomberg)

Fake news is not a new phenomenon. However, it has dominated headlines in the aftermath of Brexit and the 2016 U.S. presidential election, and amid allegations over the following years of influencing elections and fomenting violence and genocide. Fake news, classified more formally as disinformation and misinformation, has changed over time in the means by which it spreads, and in its exponentially larger scope and impact.

Closer to home, doctored videos routinely incite violence, and false information is used to target dissidents or fuel political propaganda. This could be via innocuous posts circulated on social media and messaging apps by friends and family, or by dedicated groups whose sole aim is to maliciously spread disinformation. The origin of these messages remains anonymous and nearly impossible to trace, and blame inevitably falls on the platforms on which the content is posted. But should these intermediaries be held liable for content generated by third parties?

With this in mind, the Ministry of Electronics and Information Technology recently released the Draft Information Technology [Intermediaries Guidelines (Amendment) Rules] 2018 to amend the existing 2011 IT Rules. Section 79 of the IT Act, under which these Rules were notified, covers intermediary liability.

While ostensibly prompted by the need to address the epidemic of disinformation plaguing the country, it is unlikely that the draft rules as they stand today suitably address the issue at hand. Instead, they raise larger questions of censorship and surveillance of our activity online.
A user holds a mobile phone displaying a fake message shared on WhatsApp in Gadwal, Telangana. (Photographer: Dhiraj Singh/Bloomberg)

At present, intermediaries form a single, very broad category that includes internet service providers, search engines, domain name system providers, interactive websites, website hosts, and even cyber cafes. Section 79 grants these intermediaries “safe harbour” from legal action over third-party content as long as they observe “due diligence” while carrying out their duties, and follow the prescribed rules and guidelines.

Safe harbour principles protect intermediaries from the conduct of their users and were critical to the growth of the early internet. Between 2011 and 2015, intermediaries in India were obligated to take down content on receipt of a complaint by any aggrieved private party in order to qualify for safe harbour.

However, the landmark 2015 Shreya Singhal judgment narrowed the scope of this provision to court orders and government takedown requests alone. Such orders were themselves to be limited to the reasonable restrictions on freedom of speech and expression set by Article 19(2) of the Constitution. This curtailed censorship by private parties to some extent and established some form of due process to be followed before takedowns.

The suggested amendments to these provisions via the draft rules, therefore, warrant a closer look.

The draft rules are contentious for a number of reasons. The issue of excessive delegation of powers is immediately apparent, as subordinate rules must limit themselves to the parameters of the parent legislation, in this case Section 79 of the IT Act.

Section 79 is an exemption clause, and thus serves exclusively to limit liability under the parent Act. It is questionable, then, whether such an exemption clause can also serve as the basis for substantive obligations, which may in turn run afoul of other laws. Both the draft amendment and the original 2011 rules must be reviewed from this perspective.

A Tall Order

The duties imposed on intermediaries have also become considerably more onerous. Intermediaries with more than 50 lakh users are now obligated to set up operations in India. Aside from the previously addressed question of excessive delegation, the 50-lakh threshold seems arbitrary and rather high, and would exempt many smaller intermediaries from similar obligations.

The obligation to take down content within 24 hours places a further unnecessary burden on intermediaries.

In addition, the categories of content that users are barred from generating have been expanded to include anything that threatens “public health and safety” or promotes nicotine or intoxicants. These terms are overly broad and undefined, raising the question of whether posting a picture with a glass of wine would violate the provision.

This also exceeds the constitutional limits set out by the Shreya Singhal judgment.

Intermediaries are also obligated to inform users of these conditions at least once a month, a frequency that is unnecessary and could result in a poor user experience.

The period for which unlawful content must be retained has been increased to 180 days, and content can be retained indefinitely if required by an investigating agency. There is no requirement to inform users about this retention and use of their data, which calls into question who actually owns the data in the first place.

Further, the requirement for traceability of user-generated content raises the question of whether end-to-end encrypted services such as WhatsApp will need to decrypt content. While these concerns are valid, there has been some discussion of developments that could enable traceability without breaking encryption.

However, the recent MHA notification, which clarified which government agencies are allowed to intercept, monitor, and decrypt any data, highlighted the sweeping powers of surveillance these agencies already enjoy under existing law.

These obligations are of particular concern in the absence of any privacy legislation that could set parameters for their operation and protect users’ interests.
Icons for social media applications, including WhatsApp. (Photographer: Chris Ratcliffe/Bloomberg)

A Legal Tangle

Another controversial provision calls for proactive monitoring of content by intermediaries, potentially through the use of automated tools. This raises a number of concerns.

With “unlawful content” itself being ill-defined and overbroad, dependence on algorithms ignores the potential for bias and over-censorship of legitimate content. This once again places the power of censorship in the hands of private parties, instead of the courts or the government.

This provision might also violate the parent law itself: whether proactive filtering, manual or automated, constitutes editorial control remains open to debate, and exercising such control could disqualify an intermediary from safe harbour protection.

Importantly, these rules fail to differentiate between different classes of intermediaries.

While moderating content may be easier on public-facing platforms, expecting ISPs and cyber cafes to observe the same level of diligence is unrealistic.

The draft rules fail to address the ostensible concerns about disinformation or ‘fake news’, and end up impacting other, more fundamental freedoms established over time. It is most important to first pass comprehensive privacy legislation that grants users ownership of their data and brings more accountability to data controllers. These amendments also highlight the need to revisit the 2000 IT Act and the 2011 rules, and to bring them in line with developments in technology and privacy, before moving forward with this draft amendment.

Yesha Tshering Paul is the Programme Officer at the Centre for Communication Governance, National Law University, Delhi.

The views expressed here are those of the author and do not necessarily represent the views of BloombergQuint or its editorial team.