Key points missed in the proposed amendments to the IT Rules

Last year, India’s Ministry of Electronics and Information Technology (MeitY) and the Ministry of Information and Broadcasting (MIB) attempted to set up an inter-ministerial appellate body over digital news publications and over-the-top (OTT) platforms, a committee that could modify or remove material as it deemed fit. The Bombay and Madras High Courts noted that such an oversight mechanism threatened the freedom of the media, and stayed the operation of this panel. Now, through a proposed amendment to the 2021 IT Rules, MeitY wants to set up a body over social media platforms with a similar effect: it will be able to decide what kind of speech stays on the internet, what is to be taken down, and what gets restored. The amendments also seek to impose additional obligations on social media platforms.

A Grievance Appellate Committee (GAC) would act as an oversight mechanism for the grievance redressal officers that social media platforms were required to appoint under the 2021 Rules. Those Rules have proved controversial, and their constitutional validity has been challenged before several High Courts. It is in this context that the Bombay and Madras High Courts stayed the parts of the Rules that sought to set up an oversight mechanism over digital news publishers and OTT platforms, warning that it “could rob the media of its freedom”. Even though MeitY may not refer complaints directly to the panel, the appeal process could yield a similar result.

Why, then, does the ministry believe that this GAC is constitutionally sound? It seeks to create one without legislative backing, as the amendment would be made by the ministry to its own rules, not by Parliament in a statutory law. In a democracy like India, the executive does not have the power to create bodies like the GAC, which can have immediate and far-reaching implications for the fundamental rights of citizens, with little or no procedural safeguards in the scheme of the rules.

Apart from issues of constitutionality, a GAC also does not reflect sound policymaking. It is indicative of an approach to content moderation that is neither suited nor able to scale to meet the many challenges in today’s information ecosystem. It relies on decisions about individual pieces of content to address systemic issues rooted in wider societal problems, what Harvard Law School lecturer Evelyn Douek calls ‘accountability theatre’. Merely aggregating individual decisions will not address the underlying problems, as those decisions are neither repeatable nor widely applicable, given the complexities involved.

Meanwhile, the potential lack of scalability can be demonstrated with some numbers. A popular Indian social media platform reported that it received around 7 million user complaints in March 2022. Even if only about 1% of these reach the GAC, the panel may need to deal with tens of thousands of appeals a month. Once other social media platforms are taken into account, this number could be much higher, as more people seek to exercise this option, whether in good faith or bad. It is neither desirable nor advisable for this committee to attempt to operate at such a scale. Efforts by the executive to involve itself directly in content moderation decisions are also unlikely to produce fair results.

These amendments also introduce a new requirement for social media platforms. Until now, they were only required to inform their users about, among other things, the kinds of content they could not host, display, publish, etc. An amended clause requires these platforms to ensure that their users do not do so. It is unclear how intermediaries are to comply with this obligation and whether it will translate into ‘general surveillance obligations’, under which they must actively scan all content. This could disproportionately affect ‘politically inconvenient’ speech.

Social media platforms run the risk of losing their intermediary protection under the Information Technology Act if they fail to comply with the directions of MeitY or the GAC and are held by a court to have fallen short of their obligations. Intermediary protection is essential because millions of pieces of speech are generated on multiple platforms every day in many different ways, making it extremely difficult for platforms to track what users say or do on their services, and attempts to exercise complete control over that speech would produce adverse results. These protections require a platform to remove content only in response to government orders or court directions. This model is recognized globally, and the Supreme Court laid down the same in the Shreya Singhal judgement. The proposed amendments attempt to reverse years of jurisprudence on intermediary protection as well as that Supreme Court ruling. This is not permitted under India’s constitutional scheme.

Apart from a clause stating that social media platforms must respect the constitutional rights of Indians, which does not seem practically or judicially enforceable, the latest proposals have lost an opportunity to reform a piece of delegated legislation that was widely criticized by stakeholders, activists, journalists, artists and the general public. The government should not only withdraw the proposed amendments, but also completely repeal the 2021 IT Rules and hold fresh consultations with civil society and other stakeholders, so that people are put first.

Prateek Waghre and Tanmay Singh, respectively, are Director of Policy and Senior Litigation Counsel at the Internet Freedom Foundation