In formally outlining the crux of the proposed Digital India Act, 2023, the Minister of State for IT, Rajeev Chandrasekhar, made a case for a robust replacement of the IT Act, 2000, which is now largely obsolete. He ominously flagged a question the government sought to revisit: “should there be a ‘safe harbour’ at all for all intermediaries?” This acquires significance as the government has been working towards increasing the compliance burden on Internet intermediaries, in particular through the IT Rules, 2021 and their later amendments. These Rules had already put the onus on social media intermediaries to arbitrate on content on their platforms, with regulations weighted in favour of the government of the day, and invited legal challenges as digital news media platforms, among others, questioned their constitutionality. Meanwhile, an amendment in October 2022 provided for government-appointed committees that would adjudicate on an individual user’s appeals against the moderation decisions of these intermediaries. In January 2023, the IT Ministry proposed an amendment on the take-down of social media/news content that has been marked as “fake” or “false” by the Press Information Bureau or any other government agency. These, in sum, had already put the safe harbour protections for intermediaries at considerable risk.
Regulation of hate speech and disinformation on the Internet is a must, and intermediaries, including digital news media and social media platforms, have an accountable role to play. The IT Rules’ requirements that users be given prior notice before content is removed or access disabled, and that intermediaries publish periodic compliance reports, are well taken. Social media intermediaries should not take down users’ posts or communications except in the interests of public order and to avoid legal consequences. But care should be taken to ensure that requirements on intermediaries do not become needlessly onerous and punitive, which would also vitiate the principle of safe harbour. There is a legitimate concern that the government is keener on regulating or taking down critical opinion or dissent on social media/news platforms than hate speech or disinformation, which in many cases has originated from representatives of the state. Safe harbour provisions, in particular Section 230 of the U.S. Communications Decency Act, 1996, which explicitly provided immunity to online services with respect to user-generated content, went a long way in catalysing the Net’s development. While modern regulations to tackle misinformation, problematic content and the side effects of the Internet in its current form are a must, they should still retain the first principles of safe harbour without whittling down its core.