Shubham Jain
Introduction
There has been an upsurge in cyber crimes against women amidst Covid-19. One of the most controversial incidents involved miscreants hailing from Delhi who were sharing objectionable material of minor girls over a group named ‘bois locker room’ (the incident is hereinafter referred to as the “Boys Locker Room” incident). The Delhi Police Cyber Crime Cell and the Delhi Commission for Women have taken suo motu cognizance of the incident. It is common knowledge that this is not an isolated incident; such groups and pages are easy to find on social media platforms. However, what definitely catches one’s attention upon a bare perusal of the screenshots is the participants’ discussion about changing their social media handles, bios, and profile pictures after the screenshots started drawing attention. The screenshots also show messages from deleted Instagram accounts. This leads us to a larger question: what is the role of social media platforms such as Instagram, Facebook, and Snapchat in restricting such criminal acts and assisting in their investigation?
Role of social media platforms and the safe harbor clause
Social media platforms, which facilitate the publication of user-generated content, as well as internet service providers (“ISPs”), are known as intermediaries, as defined under the Information Technology Act, 2000 (“IT Act”).[1] An intermediary is not held accountable for unlawful user-generated content if it exercises no editorial control over that content.[2] This is known as the safe harbor provision; it bestows on the intermediary a degree of protection against liability for user-generated content. Understandably, social media platforms cannot be held liable for the user-generated content on their platforms, and cannot be expected to respond to every request and complaint from millions of users. They are only required to inform users, through their terms and conditions or user agreement, that publication of content which is “obscene, pornographic, libelous, invasive of another’s privacy”,[3] or which may “harm minors in any way”,[4] is prohibited. However, an intermediary’s responsibility is limited to putting such clauses in its user agreement; it is not legally bound to take any action against the publication of such content unless the content is brought to its ‘actual knowledge’. Further, intermediaries are required to preserve such information and associated records for at least ninety days for investigation purposes.[5] The records related to such information must be disclosed to government agencies upon receipt of an order issued in furtherance of a legal obligation, even if the data includes sensitive information.[6] This rule also empowers administrative/quasi-judicial bodies to obtain such records, as it does not distinguish between orders issued by such bodies and orders issued by judicial bodies, so long as the order is backed by law.
However, end-to-end encrypted messaging services such as WhatsApp are known to pose difficulties for security agencies seeking to intercept and decrypt such information. WhatsApp claims that it does not retain any logs of such communications; it does, however, produce records of identities, location, telephone numbers, and contact details when such information is subpoenaed by the courts, pursuant to the legal requirement of assisting law enforcement agencies.[7]
Striking the right balance – Fundamental Rights and Judicial Intrusion
In light of the fundamental right to freedom of speech and expression and the reasonable restrictions thereto, the Apex Court set the benchmark as the restrictions enumerated under Article 19(2) of the Indian Constitution and observed that an intermediary would be considered to have ‘actual knowledge’ upon receipt of a notice from an appropriate government agency, or a court order.[8] Therefore, even if content on a platform is reported by users, the intermediary is not bound to take the content down unless a relevant government agency or court orders it to do so. When users report content on the various grounds available, the intermediary’s decision to act upon the report is subject to its own policy, user agreement, and community standards. The absence of a legal mandate to act on user-reported content, the need to approach the authorities, the judicial procedure involved, and the fear of societal implications together lead to cases going unreported. This does not help the cause of preventing social media platforms from becoming a virtual ‘boys locker room’.
Certain amendments to the Information Technology (Intermediary Guidelines) Rules, 2011 were proposed by the Ministry of Electronics and Information Technology in 2018, which could help prevent similar incidents from taking place. The proposed Rule 3(9) required social media platforms to deploy “automated tools … to proactively identify, remove or disable public access to unlawful information and content.”[9] The amendment has been heavily criticized by various organizations, primarily on three grounds: [a] its practical implications may result in abrogation of the freedom of speech and expression; [b] the terms used in the proposed amendment are vague and have a very wide scope; and [c] current technical advancements are not sophisticated enough to determine meaning the way a human agency can. However, it has been held that intermediaries need to be more diligent and use measures such as content filters to restrict the publication of obscene or pornographic material on their platforms.[10]
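To illustrate the third criticism, consider a minimal sketch of what a crude “automated tool” might look like. The draft rule names no particular technique, so the keyword-matching approach, the patterns, and the function name below are assumptions made purely for illustration; a real deployment would use far more sophisticated classifiers, yet would face the same problem of context.

```python
import re

# Hypothetical red-flag patterns; the draft rule prescribes no technique,
# so this keyword-matching approach is an assumption for illustration only.
RED_FLAG_PATTERNS = [
    r"\brape\b",
    r"\bnudes?\b",
    r"\bleak(ed)?\b.*\b(photos?|pics?)\b",
]

def is_red_flagged(post_text: str) -> bool:
    """Return True if the post matches any red-flag pattern."""
    text = post_text.lower()
    return any(re.search(pattern, text) for pattern in RED_FLAG_PATTERNS)

# The filter catches an obvious threat...
print(is_red_flagged("i will leak your pics"))                # True
# ...but it also flags a news report about a rape trial, showing why critics
# argue that automated tools cannot determine meaning the way a human can.
print(is_red_flagged("court convicts accused in rape case"))  # True
```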
Further, another proposed amendment prescribes that the “intermediary shall enable tracing out of such originator of information on its platform as may be required”.[11] Such a provision may also require end-to-end encrypted messenger services such as WhatsApp to record and store data. Understandably, it has been criticized on, inter alia, two grounds: [a] possible encroachment on people’s fundamental right to privacy; and [b] the use of vague terms with a very wide scope.
Drawing ideas from the aforementioned proposals, and lessons from incidents such as the Boys Locker Room incident, it is recommended that a notification containing guidelines incorporating the proposed amendments be issued. The guidelines can mandate the use of automated tools based on artificial intelligence and machine learning to red-flag content related to, inter alia, rape threats, sexual harassment, publication of obscene and pornographic material, and material that may harm minors in any form. Further, it is recommended that a human agency mandatorily supervise the red-flagged content and decide the course of action; a sketch of such a workflow follows below. A similar model can also be used to monitor user-reported content. The application of such guidelines may be extended to platforms such as Facebook, Instagram, WhatsApp, Snapchat, LinkedIn, Tinder, etc., or any such intermediary, based on monthly active users.
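The recommended human-in-the-loop supervision could take the shape of a simple review queue: automated tools (or user reports) enqueue flagged posts, and a human moderator decides the course of action for each. This is a hedged sketch only; the FlaggedPost and ReviewQueue types, the decision labels, and the moderator callback are hypothetical names invented here, not anything prescribed by the proposed rules.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class FlaggedPost:
    post_id: str
    text: str
    source: str               # e.g. "automated" or "user-reported" (hypothetical labels)
    decision: str = "pending"

@dataclass
class ReviewQueue:
    """Hypothetical human-in-the-loop queue over red-flagged content."""
    pending: List[FlaggedPost] = field(default_factory=list)

    def enqueue(self, post: FlaggedPost) -> None:
        self.pending.append(post)

    def review(self, moderator: Callable[[FlaggedPost], str]) -> None:
        # A human moderator decides the course of action for each red-flagged
        # post: e.g. "remove", "escalate" to law enforcement, or "restore"
        # when the automated tool produced a false positive.
        for post in self.pending:
            post.decision = moderator(post)
        self.pending.clear()

queue = ReviewQueue()
queue.enqueue(FlaggedPost("p-1", "court convicts accused in rape case", "automated"))
queue.review(lambda post: "restore")  # the human overturns the false positive
```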
Further, a set of guidelines governing end-to-end encrypted messaging platforms must also be introduced, requiring such services to record and maintain communication metadata. A decryption key must also mandatorily be made available to trace messages; this would help substantially in the event of destruction of evidence on personal devices. However, considering that such an option of decryption may encroach on users’ privacy, it is critical that the provision be invoked only under strict judicial scrutiny and supervision, in cases related to, inter alia, sexual harassment, revenge porn, and morphed photos.
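By way of illustration, the metadata contemplated here could be as minimal as the record below: who sent what to whom and when, without the message content itself. Every field name is a hypothetical assumption; neither the proposed amendments nor this piece prescribes a schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List

@dataclass(frozen=True)
class MessageMetadata:
    """One metadata record per message; the encrypted content is never stored."""
    message_id: str
    sender_id: str       # account identifier of the originator
    recipient_id: str    # account or group identifier
    sent_at: datetime    # server-side timestamp
    device_id: str       # originating device, to aid tracing the originator

def log_metadata(store: List[MessageMetadata], record: MessageMetadata) -> None:
    # Disclosure of these records would occur only under the strict judicial
    # scrutiny recommended above; access control is out of scope of this sketch.
    store.append(record)

audit_store: List[MessageMetadata] = []
log_metadata(audit_store, MessageMetadata(
    message_id="m-1001",
    sender_id="u-42",
    recipient_id="g-7",
    sent_at=datetime.now(timezone.utc),
    device_id="d-3",
))
```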
Needless to say, the terms used to draft such guidelines must neither be vague nor have an overly wide scope. A very thin line separates the necessity of preventing cases similar to the Boys Locker Room incident from encroachment upon users’ freedom of speech and expression and right to privacy.
Conclusion
Incidents involving crimes like sexual harassment, morphed images, revenge porn, and rape threats are common in cyberspace, and the victims often suffer in silence. It is time to employ measures that prevent such crimes from happening, or at least minimize the suffering of the victims. Although criminal complaints can be filed against the perpetrators, this route is often avoided. However, with the correct policy measures and technological advances on the part of the government and the intermediaries respectively, not only can the suffering of victims be mitigated, but social media can also become a free space for the exchange of opinions and ideas. The thin line between encroaching upon civil liberties and protecting the interests of victims must be kept in mind; but a step must be taken before social media platforms turn into a ‘boys locker room’.
[1] Section 2(1)(w), Information Technology Act, 2000.
[2] Section 79, Information Technology Act, 2000.
[3] Rule 3(2)(b), Information Technology (Intermediary Guidelines) Rules, 2011.
[4] Rule 3(2)(c), Information Technology (Intermediary Guidelines) Rules, 2011.
[5] Rule 4, Information Technology (Intermediary Guidelines) Rules, 2011.
[6] Rule 6, Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011.
[7] Rule 3, Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009.
[8] Shreya Singhal v. Union of India, (2015) 5 SCC 1.
[9] Draft Rule 3(9), Information Technology [Intermediaries Guidelines (Amendment) Rules], 2018.
[10] Avnish Bajaj v. State, 150 (2008) DLT 769.
[11] Draft Rule 3(5), Information Technology [Intermediaries Guidelines (Amendment) Rules], 2018.
The Author is a student at National Law University, Jodhpur