Explained | How will the U.K.’s Children’s Code impact digital space norms?

What are the threats to children online? Will children in India benefit from the Code?

September 12, 2021 04:15 am | Updated December 04, 2021 10:31 pm IST

Bounds set: The regulations will make using the digital space safer for children, especially while accessing online services.

The story so far: Last week the U.K. government brought into effect the Age Appropriate Design Code, or the Children’s Code, as an amendment to the Data Protection Act, 2018, operationalising a set of regulations that will make using the digital space safer for children. While the Code is officially in place only in the U.K., tech majors such as TikTok, Instagram and YouTube have tightened safety rules for children, and campaigners hope this will become the norm globally.

What is the Children’s Code?

The Children’s Code is a data protection code of practice for online services likely to be accessed by children. As 5Rights Foundation, which spearheaded the movement, said, “It has the potential to completely transform the way that companies collect, share and use children’s data, requiring them to offer children a high level of privacy protection by default.” It sets out 15 standards for online services, including apps, games, toys and devices, and even news services. Unless a service provider can prove that children do not access its service at all, it is required to consider making changes as per the Code.


What are the threats to children online?

Research conducted by 5Rights and Revealing Reality pointed out that within 24 hours of a social media profile being created, children were being targeted with graphic content. It established the pathways between the design of digital services and the risks children face online. According to 5Rights, “It shows that services such as Facebook, Instagram and TikTok are allowing children, some as young as 13 years old, to be directly targeted within 24 hours of creating an account with a stream of harmful content. Despite knowing the children’s age, the companies are enabling unsolicited contact from adult strangers and are recommending damaging content, including material related to eating disorders, extreme diets, self-harm and suicide as well as sexualised imagery and distorted body images.” Further, the researchers concluded that even if the services were not conceived with the intent of putting children at risk, these design choices are not ‘bugs’ or mistakes in the code that let such harms reach children unbeknownst to the service providers. “These are not ‘bugs’ but features. Revealing Reality interviewed engineers and designers who explained they design to maximise engagement, activity and followers — the three drivers of revenue, not to keep children safe.”

U.K.’s Information Commissioner Elizabeth Denham said, “Data sits at the heart of the digital services children use every day. From the moment a young person opens an app, plays a game or loads a website, data begins to be gathered. Who’s using the service? How are they using it? How frequently? Where from? On what device? That information may then inform techniques used to persuade young people to spend more time using services, to shape the content they are encouraged to engage with, and to tailor the advertisements they see. For all the benefits the digital economy can offer children, we are not currently creating a safe space for them to learn, explore and play.”

Who does the Code apply to?

The Code, according to 5Rights, applies to “information society services likely to be accessed by children”. An information society service (ISS) is defined as “any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services”. This includes apps; programs; search engines; social media platforms; online messaging or internet-based voice telephony services; online marketplaces; content streaming services (like video, music or gaming services); online games; news or educational websites; and any websites offering other goods or services to users on the internet. Electronic services for controlling connected toys and other connected devices are also ISS. The Code applies to U.K.-based companies as well as to non-U.K. companies that process the data of children in the country. However, as the example set by some tech giants shows, it makes sense to make the entire architecture child-friendly rather than restrict changes to one region alone.


Will children in India benefit from the Code?

As Ms. Denham says, “It is rooted in the United Nations Convention on the Rights of the Child that recognises the special safeguards children need in all aspects of their life.” John Carr, an online safety expert based in the U.K., notes on his blog that the UNCRC adopted an addition, General Comment 25, which addresses child rights in the context of the digital environment. If tech giants universalise their safety architecture, children across the world will benefit from the Code. However, child rights activists say it is high time the Indian government incorporated child safety into its social media agenda.
