Apps for children must offer privacy by default

Written by Chief Editor

Image: girl on a tablet (Getty Images). Caption: the information commissioner said it was “surprising” that these protections weren’t already built into websites.

Apps, social media platforms and online games specifically targeting children will now have to put privacy at the heart of their design.

A code of conduct setting out how children’s data should be protected has come into force, and businesses have 12 months to comply with the new rules.

If they do not, they could face large fines imposed by the Information Commissioner’s Office.

Some have questioned whether the code will bring about real change.

Information Commissioner Elizabeth Denham said the code is an important step towards protecting children online.

“In a generation’s time, we will all find it astonishing that there was ever a time when there was no specific regulation to protect children online. It will be as normal as fastening your seat belt.

“This code makes it clear that children are not like adults online and that their data needs more protection.”

She said the Information Commissioner’s Office (ICO) recognizes that it may be difficult for small businesses to comply with the code and will offer “help and support” over the coming year.

Among the principles of the code are:

  • the best interests of the child should be a primary consideration in the design and development of online services
  • high privacy settings should be applied by default
  • only a minimal amount of personal data should be collected and stored
  • children’s data should not be shared unless there is a compelling reason to do so
  • children’s personal data should not be used in ways that could be harmful to their well-being
  • geolocation should be turned off by default
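
The principles above amount to a privacy-by-default configuration. As a purely hypothetical sketch (the class, field names and sharing rule are invented for illustration, not taken from the code itself), they might look like this in an app’s settings model:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical illustration: every privacy-relevant setting defaults to
# its most protective value, matching the "high privacy by default"
# principle described in the article.
@dataclass(frozen=True)
class ChildPrivacySettings:
    geolocation_enabled: bool = False    # geolocation off by default
    profile_visibility: str = "private"  # high privacy by default
    personalised_ads: bool = False       # no profiling by default
    data_retention_days: int = 0         # keep only a minimal amount of data

def may_share_child_data(settings: ChildPrivacySettings,
                         reason: Optional[str]) -> bool:
    # Children's data is shared only when a compelling reason is recorded.
    return bool(reason and reason.strip())

defaults = ChildPrivacySettings()
print(defaults.geolocation_enabled)          # False
print(may_share_child_data(defaults, None))  # False
```

The point of the sketch is that the protective value is the zero-argument default: a user (or parent) must actively opt in to anything less private.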

Others that must comply with the code include educational websites, streaming services and makers of connected toys that use, analyze or profile children’s data.

‘Well intentioned’

The ICO has the power to fine companies up to 4% of their global turnover if they violate data protection guidelines. The organization has already said it will take tougher action when it sees harm to children.
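
To give a rough sense of scale (the turnover figure below is an invented example, not from the article):

```python
# The maximum penalty cited is 4% of a company's global annual turnover.
MAX_FINE_RATE = 0.04

def max_fine(global_turnover: float) -> float:
    """Upper bound on the fine for a given global turnover."""
    return global_turnover * MAX_FINE_RATE

# A hypothetical firm with a £500m global turnover could face up to £20m.
print(max_fine(500_000_000))  # 20000000.0
```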

In September last year, YouTube was fined $170m (£139m) for collecting data on children under 13 without their parents’ consent, following a US Federal Trade Commission investigation.

The scope of the protections needed for children online is huge, and the ICO may not be up to the job, said Jen Persson, a digital rights activist.

“The code is well-intentioned and, if enforced, could lead to some welcome changes in the approach of some apps and platforms: to stop collecting excessive data from children, for example, and to start meeting the basic data protection law requirements that have been in place for over 20 years.

“The main risks are that, since the ICO has not to date enforced concrete data protection law on behalf of children under its current mandate, it could be seen as lacking the ability to enforce the new elements of the code that go beyond that law and are subjective, such as the best interests of the child, or that exceed the ICO’s knowledge and technical skills.”

Andy Burrows, head of online child safety policy at the NSPCC, said he hoped the code would force people to rethink the content provided to children.

“Tech companies have a year to prepare for this transformative code, which will force them to take online harm seriously, so there can be no more excuses for putting children at risk.

“For the first time, high-risk social networks will have a legal duty to assess their sites for the risks of sexual abuse and to stop serving harmful self-harm and suicide content to children.”

Related topics

  • Data protection

  • Internet
  • Gaming
  • Information Commissioner’s Office
  • Apps
