Attempts at technological filtering usually affect even speech that is not targeted by the filtering mechanism
On May 5, the Supreme Court will hear Kamlesh Vaswani’s infamous anti-pornography petition again. The petition makes some rather outrageous claims. Watching pornography ‘puts the country’s security in danger’ and it is ‘worse than Hitler, worse than AIDS, cancer or any other epidemic,’ it says. This petition has been pending before the Court since February 2013, and seeks a new law that will ensure that pornography is exhaustively curbed.

Disintegrating into binaries
The petition assumes that pornography causes violence against women and children. The trouble with such a claim is that the debate disintegrates into binaries; the two positions being that pornography causes violence or that it does not. The fact remains that the causal link between violence against women and pornography is yet to be proven convincingly and remains the subject of much debate. Additionally, since the term pornography refers to a whole range of explicit content, including homosexual adult pornography, it cannot be argued that all pornography objectifies women or glamorises violent treatment of them.
Allowing even for the petitioner’s legitimate concern about violence against women, it is interesting to note that of all the remedies available, he seeks the one which is authoritarian but may not have any impact at all. Mr. Vaswani could have, instead, encouraged the state to do more to meet its international obligations under the Convention on the Elimination of All Forms of Discrimination against Women (CEDAW). CEDAW’s General Recommendation No. 19 deals with violence against women and recommends steps to reduce it. These include encouraging research on the extent, causes and effects of violence, and adopting preventive measures, such as public information and education programmes, to change attitudes concerning the roles and status of men and women.

Child pornography
Although different countries disagree about the necessity of banning adult pornography, there is general international consensus about the need to remove child pornography from the Internet. Children may be harmed in the making of pornography, and would at the very minimum have their privacy violated to an unacceptable degree. Being minors, they are not in a position to consent to the act. Each act of circulation and viewing adds to the harmful nature of child pornography. Therefore, an argument can certainly be made for the comprehensive removal of this kind of content.
Indian policy makers have been alive to this issue. The Information Technology Act (IT Act) contains a separate provision for material depicting children explicitly or obscenely, stating that those who circulate such content will be penalised. The IT Act also criminalises watching child pornography (whereas watching regular pornography is not a crime in India).
Intermediaries are obligated to take down child pornography once they have been made aware that they are hosting it. Organisations or individuals can proactively identify and report child pornography online. Other countries have tried, with reasonable success, systems using hotlines, verification of reports and co-operation of internet service providers to take down child pornography. However, these systems have also sometimes resulted in the removal of other legitimate content.

Filtering speech on the Internet
Child pornography can be blocked or removed using the IT Act, which permits the government to send lists of URLs of illegal content to internet service providers, requiring them to remove this content. Even private parties can send notices to online intermediaries informing them of illegal content and thereby making them legally accountable for such content if they do not remove it. However, none of this will be able to ensure the disappearance of child pornography from the Internet in India.
Technological solutions like filtering software that screens or blocks access to online content, whether at the state, service provider or user level, can at best make child pornography inaccessible to most people. Skilled users will be able to circumvent such technological barriers, since these remain barriers only until better technology enables circumvention.
Additionally, attempts at technological filtering usually affect even speech that is not targeted by the filtering mechanism. Therefore, any system for filtering or blocking content on the Internet needs to build in safeguards to ensure that processes designed to remove child pornography do not end up being used to remove political speech or other constitutionally protected expression.
In the Vaswani case, the government has correctly explained to the Supreme Court that any greater attempt to monitor pornography is not technologically feasible. It has pointed out that human monitoring of content will delay transmission of data substantially, will slow down the Internet, and will also be ineffective, since the illegal content can easily be moved to other servers in other countries.
Making intermediaries liable for the content they host will undo the safe harbour protection granted to them by the IT Act. Without it, intermediaries like Facebook will have to monitor all the content they host, and the resources required for such monitoring will reduce the content that makes its way online. This would seriously impact the extensiveness and diversity of content available on the Internet in India. Additionally, when demands are made for the removal of legitimate content, profit-making internet companies will be disinclined to risk litigation, in much the same way as Penguin was reluctant to defend Wendy Doniger’s book.
If the Supreme Court makes the mistake of creating a positive obligation to monitor Internet content for intermediaries, it will effectively kill the Internet in India.
(Chinmayi Arun is research director, Centre for Communication Governance, National Law University, Delhi, and fellow, Centre for Internet and Society, Bangalore.)