OPINION

Should WhatsApp be held accountable for lynchings?

WhatsApp needs to change its platform to enable messages to be either public or private

Yes | Nikhil Pahwa

Misinformation and propaganda have flooded our messaging apps and little is being done by law enforcement agencies, the government, and WhatsApp to fix this.

Primary responsibility

The primary responsibility to fix this lies with law enforcement agencies. A mob takes the law into its own hands if it believes either that law enforcement agencies are incapable or unwilling to help, or that its crimes will go unpunished. A lynching is a lynching, whether or not it was precipitated by a WhatsApp forward. Mob violence is not an act of nature: someone leads the mob, and there is often politics behind such acts, perhaps even protection.

Law enforcement agencies shut down the Internet to prevent the forwarding of messages and possible riots. In 2017, according to data from the Software Freedom Law Center (SFLC), India had 70 Internet shutdowns. We're only halfway through 2018 and we have already reached that number. An Internet shutdown is a suspension of the constitutional right to free speech: a disproportionate act of censorship of all speech in response to the actions of a few. The data suggest that there are no shutdowns in Delhi, Mumbai and Bengaluru, while smaller towns bear the brunt of such actions. The lack of capacity of law enforcement agencies in smaller towns to deal with these situations is worrying, especially in the run-up to elections. The data also indicate that the States with the most Internet shutdowns are those where the BJP is in power or in a coalition: Jammu and Kashmir (before the government collapsed), followed by Rajasthan, Haryana, U.P. and Gujarat.

State governments need to build law enforcement capacity and ensure prosecution in case of mob violence. A new law covering lynchings will be ineffective if our criminal justice system is incapable of enforcing the law. The Centre needs to do better while engaging with messaging and social media platforms: it took the Information Technology Minister, Ravi Shankar Prasad, till 2018 to ask WhatsApp about action being taken to address misinformation. This is despite the fact that three years ago, T.N. Seema, a Rajya Sabha MP, had asked the Home Ministry to clarify “the mechanism existing with government to deal with the danger of high-tech rumour-mongering kind of Internet-rumour-bombs which may lead to communal tension and fear among the common masses.”

It is important for platforms like WhatsApp to not be legally accountable for the messages being sent through them. That would amount to holding telecom operators accountable for the calls that you make. However, that doesn’t mean that WhatsApp isn’t responsible for helping ensure that users are held to account for their messages.

What WhatsApp should do

WhatsApp needs to change its platform to enable messages to be either public or private. Messages between individuals should remain private and should not be forwardable. However, if a message's creator wants to make it forwardable, the message should be treated as public and tagged with a unique ID linked to the original creator. This would allow WhatsApp to take down such a message across its network once it is reported, and to identify the creator when law enforcement agencies make a court-directed request. This would ensure accountability, allow the platform to remain neutral, and ensure that illegal speech is addressed. It is important to remember that incorrect or false information is not in itself illegal; people can simply be mistaken. It is messages that incite violence that need to be addressed. However, given the apathy of the government, law enforcement agencies and WhatsApp, there is likely to be more mob violence, and more lynchings.
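The public/private model proposed above can be sketched in code. This is a minimal illustration of the idea only, not WhatsApp's actual design; every name in it (`MessageRegistry`, `create_message` and so on) is an assumption made for the example.

```python
import uuid

# Hypothetical sketch of the proposed public/private message model.
# All class and method names are assumptions, not WhatsApp's design.

class MessageRegistry:
    """Tracks forwardable ("public") messages by a unique ID tied to the creator."""

    def __init__(self):
        self._creators = {}    # message_id -> creator_id
        self._revoked = set()  # message_ids taken down after reports

    def create_message(self, creator_id, forwardable):
        # Private messages get no network-wide ID and cannot be forwarded.
        if not forwardable:
            return None
        message_id = str(uuid.uuid4())
        self._creators[message_id] = creator_id
        return message_id

    def can_forward(self, message_id):
        # A public message may be forwarded only while it is not revoked.
        return message_id in self._creators and message_id not in self._revoked

    def revoke(self, message_id):
        # Platform-wide takedown of a reported message, by its unique ID.
        self._revoked.add(message_id)

    def creator_of(self, message_id):
        # Under the proposal, disclosed only on a court-directed request.
        return self._creators.get(message_id)
```

Under this model, a takedown propagates across the network by ID, while the creator's identity is held back until a court-directed request is made.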

Nikhil Pahwa is the founder of MediaNama.com

***

No | Jaijit Bhattacharya

The government must educate the public and law enforcement agencies should do their job

We have come to witness the destructive power of social media. The obvious question is, who is responsible for these lynchings? The easiest thing to do is to find the most obvious entity in this chain of malicious videos being spread, and blame that entity. In this case, the entity is WhatsApp.

There is no doubt that mobile messaging platforms are in a powerful position to make significant interventions to prevent mob attacks that are arising out of what they themselves are facilitating. However, messaging platforms are only one actor in the chain of malafide content that is being spread.

Understanding the chain

The chain of malafide content being spread includes people who are creating such content (and are clearly investing significant time and perhaps money in doing this), mobile messaging platforms, people who are forwarding such content, people who are organising the mobs, and authorities who are responsible for maintaining law and order.

Let us look at the chain of spreading malafide content. First, there is a content creator. This is not the first time that mob frenzy has been triggered in India through messages on mobile phones. A prominent case was in August 2012, when a social media-led panic caused a mass exodus of northeastern people from Bengaluru. If an adversary is quickly learning how to spread hate from a city to the entire country, as is the case now, it is only a matter of time before the adversary's next attack is on institutions. That would have a far greater destructive impact on the country. So, would it help if we forced only one mobile messaging platform to take steps to stop the spread of malicious videos? It may help for now, but the forces that seem to be getting better at social media-led attacks will simply move to an alternative platform, just as they started with MMS for the Bengaluru exodus and then moved to mobile messaging.

We must also keep in mind that India is perhaps the only place in the world where mobile messaging has led to such a widespread mass exodus and lynchings.

Vested interests

Why hasn’t the same happened in other countries? Clearly, one of the reasons is that such behaviour is being engineered by powers with vested interests that are detrimental to India. But there is also the fact that we have some uneducated, underexposed and gullible citizens who are living in a society with deep fissures and mistrust. We also have highly educated people — doctors, lawyers, engineers, etc. — who fail to understand the power of technology in creating chaos and who find it hard to differentiate truth from fiction.

It is necessary for the government to urgently educate the public. Similarly, enforcement agencies need to develop standard operating protocols to tackle such situations. Such a step needs to be reinforced by appropriate regulatory changes that make it mandatory for entities in the chain of information dissemination to share appropriate alerts with the law enforcement authorities, in a real-time electronic format.

In the absence of such a regulation, information intermediaries can neither be triggered to act, nor be held liable for any acts of omission on their part.

Jaijit Bhattacharya is president, Centre for Digital Economy Policy Research

***

It's Complicated | Mishi Choudhary

The fixes are available if we can stop the blame game and work together

In the past two months, more than 20 people have been killed in attacks by mobs that were provoked by messages on social media. Several media outlets are urging immediate action against WhatsApp without offering any concrete ideas. The government has warned WhatsApp’s parent company, Facebook, that it cannot evade “accountability and responsibility”. Meanwhile, WhatsApp has offered an award of $50,000 to anyone who can help stop the spread of fake news on its platform.

These cacophonous demands to hold WhatsApp solely responsible for the lynchings underscore a classic response of our society: blame the messenger and avoid looking in the mirror. The problem actually has multiple facets.

Maintaining law and order

First, if people sometimes take the law into their own hands, it is because they believe that the government is unable to prevent violations of public order, or that it cannot investigate and prosecute those responsible and secure justice. Vigilantism is a consequence of this basic failure of the government in India. No government likes to admit that each lynching exposes this fundamental flaw, so it looks to blame other forces when attacks occur. Foreign platform companies are convenient targets. Investigating those responsible for a crime that occurred in real life, not online, should not be complicated. If the perpetrators are brought swiftly to justice, the message that there is no impunity for mob justice will ring loud and clear. Demands for action should be directed at the law enforcement agencies that have the responsibility of maintaining law and order.

Second, the responsibility of WhatsApp should be assessed with an appreciation of how the platform actually works. This should not be used as an excuse to break encryption and deprive users of secure communications. There was a legal tussle between Apple and the Federal Bureau of Investigation over access to the iPhone used by a shooter in the San Bernardino attack in 2015. Such tussles between technology companies and the government do not have good outcomes for users.

Third, WhatsApp is responsible, as any business must be, for assessing the social risks it creates and for helping manage those risks. Short messaging is an immensely powerful social force, as advertisers, politicians, governments and wrongdoers have all learned. A system that broadcasts intense emotional signals must take account of its effects. That doesn’t mean regulating it out of existence, but we as a civil society have a right to expect careful analysis not only of business opportunities, but also of social needs.

Fixes are available

As a range of organisations led by SFLC.in have pointed out, WhatsApp allows people to be added to groups without their knowledge or consent. This is a bug in the platform that causes increased social risk, because socially inflammatory messaging is easily spread by adding people to groups formed for the purpose of incitement. These are groups that these people would probably never join on their own.
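The consent fix implied here can be sketched as an opt-in membership flow. This is a hypothetical illustration of the principle; the class and method names are assumptions, not WhatsApp's actual behaviour.

```python
# Hypothetical sketch of consent-based group membership: being "added"
# only creates a pending invitation, and membership begins when the
# invitee opts in. Names are assumptions made for this example.

class Group:
    def __init__(self, name):
        self.name = name
        self.members = set()
        self._pending = set()  # invitations awaiting the invitee's consent

    def invite(self, user):
        # Adding someone to the group only records a pending invitation.
        if user not in self.members:
            self._pending.add(user)

    def accept(self, user):
        # Membership starts only when the invited user explicitly agrees.
        if user in self._pending:
            self._pending.discard(user)
            self.members.add(user)

    def decline(self, user):
        # Declining discards the invitation; the user never becomes a member.
        self._pending.discard(user)
```

With this flow, a group formed for incitement cannot conscript members who never agreed to be there.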

Fixing this will reduce the risks. We shouldn't need to build a system of intrusive regulation to inspire platform companies, whose every choice of feature or implementation affects the lives of billions of people, to take more care of the consequences of their software's behaviour. The fixes are available if we stop the blame game and work together.

Mishi Choudhary is the founder and legal director at the Software Freedom Law Center
