Legislations and Policies

Information Technology Rules, 2021: The Damocles’ Sword Hanging Over Safe Harbour Protections.

Akshat Bhushan


The article examines the impact of the newly notified IT Rules, 2021 on the safe harbour provisions in India. It suggests alternatives to prevent the violation of the freedom of speech and, to that effect, argues that the power to take down or block access to content should rest with the judiciary and the Union Government, with the intermediaries merely assisting them in this process, without sitting in judgment of the content posted on their platforms.

Introduction

Twitter has been in the eye of the storm for non-compliance with the newly notified Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (hereafter, the Rules, 2021). Section 79 of the Information Technology Act, 2000 grants intermediaries immunity from liability for third-party content, subject to certain conditions prescribed in the Rules made thereunder. The Rules, 2021 have been criticised for a variety of reasons, ranging from the censorship of OTT platforms and digital news media to the introduction of the traceability requirement. This piece, however, will focus on the impact these Rules have on the safe harbour provisions in India and suggest alternatives to mitigate the widespread dissemination of objectionable content on the internet without violating the freedom of speech.


Requirements under the Rules, 2021 to avoid liability

Intermediaries are required to observe due diligence under Rules 3 and 4 of the Rules, 2021 to avail of the protection under Section 79. If an intermediary fails to comply with these Rules, its Chief Compliance Officer will be held personally liable under Rule 4(1)(a) in proceedings concerning third-party content. Rule 3(1)(c) allows intermediaries to voluntarily disable or block the content or accounts of users on the grounds mentioned in Rule 3(1)(b). These grounds, especially those given in Rule 3(1)(b)(ii), are ambiguous. Rule 4(4) goes a step further by prescribing the deployment of technologies, including automated tools, to identify content that depicts rape or child sexual abuse, or content previously blocked by the Government or on Court orders. The word ‘endeavour’ is preceded by ‘shall’ and not ‘may’, making this sub-rule obligatory rather than advisory in nature.

India and jurisdictions like the EU have recognised the inadequacies of such technologies. In the Scarlet and Netlog cases, the CJEU reaffirmed Article 15 of the E-Commerce Directive, 2000, which prohibits the imposition of general monitoring obligations. In 2015, the Indian Supreme Court also ruled against general monitoring obligations on intermediaries. In 2017, however, the Court moved away from this principle by requiring search engines to remove pre-natal sex determination advertisements. The 2017 judgment should be considered per incuriam, as it did not even refer to the 2015 judgment.


Muzzling free speech

Automated tools fail to perceive irony, humour, or parody, and are therefore ill-suited to blocking or flagging content. In 2016, Facebook infamously took down a picture of the ‘Napalm Girl’, an iconic image from the Vietnam War: its automated tools identified the picture as showing nudity and mechanically removed it without understanding the context.

Determining matters like defamation and hate speech requires a thorough investigation into the facts and circumstances of each case. The South Korean Supreme Court held intermediaries liable for not deleting defamatory content in a case alleging that a woman had committed suicide because her boyfriend deserted her during her pregnancy. After this case, wary intermediaries in South Korea began over-censoring content to avoid liability. The requirement to remove content identical to content previously removed on the orders of the Court or the Government is similarly imprecise. It is possible that certain defamatory content or hate speech was removed in the past at the direction of the Courts or by Government order. But for intermediaries to determine whether subsequently posted content is identical to it, they would have to assume an investigatory or judicial role. Consequently, intermediaries may start taking down content merely because it deals with a subject matter similar to previously blocked content, while ignoring its intent, context, and purpose.

The provisos to Rule 4(4) state that action taken under that sub-rule should not violate privacy and free speech, and that human oversight should be maintained over the tools along with their regular review. However, in the absence of guidelines, these provisos are not only toothless but can also be used conveniently by the Government to reprimand intermediaries for taking down content that is favourable to the ruling party. Conversely, the Government may invoke Rule 4(4) when an intermediary does not take down content that is unfavourable to the ruling party. As a result, intermediaries may go out of their way to exercise their powers under Rules 3(1)(b) and 3(1)(c) to remove content critical of the ruling party and appease the Government of the day.


The Notice and Notice Model to address civil claims

Civil claims against third-party content can be handled through a modified version of the notice and notice model found in Sections 41.25 and 41.26 of Canada’s Copyright Modernization Act. The complainant should send a detailed notice to the intermediary identifying the allegedly infringing content. The intermediary would forward the notice to the alleged wrongdoer within a particular period of time, after which the alleged wrongdoer may either remove the content or send a counter-notice to the intermediary within a prescribed period, which would in turn be forwarded to the complainant. Subsequently, the complainant may take the matter to the courts or law enforcement agencies, and the intermediary should be required to retain, for a prescribed period, the details needed to identify the alleged wrongdoer. Alternatively, if the alleged wrongdoer fails to respond to the complaint within the prescribed time, the intermediary should remove the content. This would be in line with the 2011 Report of the UN Special Rapporteur as well as the landmark 2015 Indian Supreme Court judgment, which firmly cautions against granting wide-ranging censorship powers to private entities.


Need for closer coordination between intermediaries and Government agencies to counter Non-Consensual Intimate Images (NCII) and violent content online

Hate speech, NCII, and extremist content have undeniably found a much wider reach because of the internet. Such serious transgressions cannot be addressed merely through the notice and notice model suggested above. For such cases, social media intermediaries must appoint a Grievance Officer who would promptly forward these complaints, within a prescribed time, to a Nodal Officer, who would in turn coordinate with the law enforcement agencies. The intermediary would be liable to pay compensation if these officers are not appointed or if they fail to act within the prescribed time. However, intermediaries should not themselves judge and remove content posted on their platforms. The Rules could also incorporate a “Good Samaritan” provision under which intermediaries may voluntarily choose to monitor content; if they find something objectionable, they may forward it to the relevant Government agencies. This should, however, remain a completely voluntary exercise with no effect on the safe harbour.

After coordinating with the Nodal Officer, the enforcement agencies should preferably obtain an order from the Court or, in an emergency, block the content immediately, with such emergency measures confirmed by the Courts within 48 hours. There might be concerns that this will increase the burden on the judiciary. The Union Government should therefore be allowed to block access to content without having to approach the Court every time. But before that happens, some sweeping changes are required in the law.


Reforms in the IT Blocking Rules, 2009

The Government’s power to block content is presently governed by Section 69A and the 2009 Blocking Rules made thereunder. However, these Rules are replete with issues that seriously impinge on free speech. Under Rule 7, a Committee is required to examine the justifiability of a request made by the Nodal Officer to block access to information. Rule 14 provides for a Review Committee, which may order the unblocking of information where it finds that a direction issued by the Committee under Rule 8(6) was not in conformity with Section 69A(1). However, both Committees are constituted solely of bureaucrats serving under the Union Government. It would be desirable to include former High Court and Supreme Court Judges, appointed by an Appointment Commission comprising the Chief Justice of India, the Prime Minister, and the Leader of the Opposition in the Lok Sabha. This would imbue the Committees with a sense of independence and judicial expertise.

The controversial Rule 16 of the 2009 Rules requires strict confidentiality to be maintained with respect to the complaints and requests made and the action taken, as a result of which content on social media is taken down without the reasons even being communicated. The Apex Court disappointingly upheld the validity of the 2009 Rules in 2015, stating that the Rules provided enough safeguards against arbitrary State action. However, Rule 8(6) and Rule 16 do not even conform to Section 69A, under which they were made. It is a well-established principle of law that delegated legislation cannot go beyond the ambit of, or supplant the provisions of, the Parent Act under which it was issued. Section 69A uses the words: “for reasons to be recorded in writing, by order, direct any agency of the Government or intermediary to block….” However, Rule 8(6) of the Blocking Rules does not mandate the Committee to issue a written order stating the reasons for blocking access, and Rule 16 makes it mandatory to maintain confidentiality with respect to the action taken by the Committee. Rule 8(6) and Rule 16 are therefore ultra vires the Parent Act itself. It must be made mandatory for the Committee to issue a detailed order available in the public domain. This would ensure that the Government does not get a free hand to suppress free speech, and would prevent intermediaries from censoring content.


Conclusion 

The safe harbour provisions are necessary for the internet to remain a safe space for the free flow of information and ideas. The power to take down or block access to content should stay with the judiciary and the Union Government, with the intermediaries merely assisting them in this process, without sitting in judgment of the content posted on their platforms.


Akshat Bhushan is a third-year student at the Hidayatullah National Law University, Raipur.