Content moderation through co-regulation

In today’s online environment, where social media platforms count millions of users, the existing model of government control over online speech is unsustainable

Updated - November 09, 2022 01:46 pm IST

Published - November 09, 2022 12:15 am IST

Mobile phone apps of social media platforms. | Photo Credit: AP

Social media platforms regularly manage user content on their websites. They remove posts, adjust their prominence, or suspend user accounts that violate the platforms’ terms and conditions. When a user’s post is taken down or their account is suspended, they can challenge that decision. Similarly, when users see harmful or illegal content online, they can flag it with the platform. Some platforms have complaint redressal mechanisms for addressing user grievances. For instance, Facebook set up the Oversight Board, an independent body that scrutinises its content moderation practices.

The online ecosystem today

Until the government introduced the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, it was voluntary for platforms to establish a grievance redressal mechanism through their terms of service. The Rules mandate platforms to establish a grievance redressal mechanism that resolves user complaints within fixed timelines. Recently, the government amended these Rules to establish Grievance Appellate Committees (GACs). Comprising government appointees, GACs will now hear appeals against the platforms’ grievance redressal decisions. This signifies the government’s tightening control of online speech, in the mould of Section 69A of the IT Act, which empowers the government to order the blocking of online content. The IT Act was passed in 2000, and Section 69A was introduced in 2008, when social media barely existed.

In today’s online environment, however, such government control over online speech is unsustainable. Social media now has millions of users. Platforms have democratised public participation and shape public discourse. As such, large platforms have a substantial bearing on core democratic freedoms. Further, with the increasing reach of the Internet, its potential harms have also grown: there is more illegal and harmful content online today. Disinformation campaigns on social media during COVID-19 and hate speech against the Rohingya in Myanmar are recent examples. With higher stakes for free speech and increasing online risks, a modern intermediary law must re-imagine the role of governments.

A modern intermediary law

Under such a law, government orders to remove content must not only be necessary and proportionate, but must also comply with due process. The recent European Union (EU) Digital Services Act (DSA), which regulates intermediary liability in the EU, is a good reference point. It requires government take-down orders to be proportionate and reasoned. The DSA also gives intermediaries an opportunity to challenge the government’s decision to block content and to defend themselves. Such processes would substantially strengthen the free speech protections of online users.

Most importantly, an intermediary law must devolve crucial social media content moderation decisions to the platform level. Platforms must have the responsibility to regulate content under broad government guidelines. Instituting such a co-regulatory framework will serve three functions. First, platforms will retain reasonable autonomy over their terms of service. Co-regulation will give them the flexibility to define the evolving standards of harmful content, thereby obviating the need for strict government mandates. This will promote free speech online, because heavy-handed government oversight incentivises platforms to over-remove content and engage in private censorship. Private censorship creates a chilling effect on user speech. In turn, it also scuttles online innovation, which is the backbone of the digital economy.

Second, co-regulation aligns government and platform interests. Online platforms themselves seek to promote speech and security on their services so that their users have a free and safe experience. For instance, during the pandemic, platforms took varied measures to tackle disinformation. Incentivising platforms to act as Good Samaritans will help build healthy online environments.

Third, instituting co-regulatory mechanisms allows the state to outsource content regulation to platforms, which are better equipped to tackle modern content moderation challenges.

The modality of a co-regulatory model for content moderation requires careful deliberation. It is important that co-regulation, while maintaining platform autonomy, also makes platforms accountable for their content moderation decisions. As content moderators, platforms exercise substantial control over the free speech rights of users. Whenever platforms remove content or redress user grievances, their decisions must follow due process and be proportionate. They must adopt safeguards such as notice, hearing, and reasoned orders while addressing user grievances.

But due process is not enough. Social media content moderation tools are not limited to the removal of posts or the suspension of accounts. Platforms often de-prioritise content to reduce its visibility. Users are unaware of, and unable to challenge, such actions because they take place through platform algorithms that are often confidential. Algorithmic transparency would increase platform accountability here.

An intermediary law should carry forward the baton of the 2021 Rules. The GACs, however, must be done away with, because they concentrate censorship powers in the hands of the government. A Digital India Act is expected to be the successor law to the IT Act. This is a perfect opportunity for the government to adopt a co-regulatory model of online speech regulation.

Nikhil Pratap is a practicing lawyer in New Delhi. Shatakratu Sahu works at Carnegie India, focusing on technology-policy issues. Views are personal

