How will new ‘age appropriate’ code protect children online?
'I believe that it will be transformational,' says information commissioner
The Information Commissioner’s Office (ICO), an independent regulatory office that aims to “uphold information rights in the public interest”, has published a new Age Appropriate Design Code to protect children online.
The code, the draft of which was first reported in April 2019, is scheduled to come into force by autumn 2021.
It consists of 15 measures that the ICO says will provide better protection for young people when they spend time online, whether they are using apps, browsing social media platforms or playing online games.
Elizabeth Denham, information commissioner, told the PA News Agency that she believes the implementation of the code will be “transformational”.
“I think in a generation from now when my grandchildren have children they will be astonished to think that we ever didn’t protect kids online,” Ms Denham said.
“I think it will be as ordinary as keeping children safe by putting on a seat belt.”
Ms Denham added that while GDPR “requires special treatment of children”, the 15 standards outlined by the new code “will bring about greater consistency and a base level of protection in the design and implementation of games and apps and websites and social media”.
Andy Burrows, head of child safety online policy at the NSPCC, said the code will require tech companies to “assess their sites for sexual abuse risks” and prevent them from permitting “harmful self-harm and pro-suicide content” on their sites for the first time.
“It is now key that these measures are enforced in a proportionate and targeted way,” Mr Burrows stated.
Here are the 15 measures that are being put into place as part of the ICO’s Age Appropriate Design Code:
1. Best interests of the child
In accordance with the United Nations Convention on the Rights of the Child, the age appropriate code emphasises that the “best interests of the child should be a primary consideration”.
2. Data protection impact assessments
This measure outlines that data protection impact assessments must be undertaken by tech firms in order to “identify and minimise the data protection risks of your service – and in particular the specific risks to children who are likely to access your service which arise from your processing of their personal data”.
3. Age-appropriate application
The ICO states that online companies must take the age range of their users into account, explaining that assessing the individual needs of children at various stages of development “should be at the heart of how you design your service and apply this code”.
4. Transparency
Transparency, the code states, “is about being clear, open and honest with your users about what they can expect when they access your online service”.
The ICO adds that acting in a transparent manner is already outlined as part of GDPR, explaining that it is essential when processing people’s personal data.
5. Detrimental use of data
According to the Age Appropriate Design Code, it is important for companies to refrain from using data “that is obviously detrimental to children’s physical or mental health and wellbeing or that goes against industry codes of practice”.
6. Policies and community standards
This measure states that when a tech firm has already published community rules and conditions, it is vital that it sticks to them.
“Keeping to your own standards should also benefit you by giving children and their parents confidence that they can trust your online service with their personal data,” the ICO says.
7. Default settings
The code states that the default privacy settings implemented by tech companies should be set in an “appropriate” manner for children.
8. Data minimisation
Data minimisation, the ICO explains, means “collecting the minimum amount of personal data that you need to deliver an individual element of your service”.
“It means you cannot collect more data than you need to provide the elements of a service the child actually wants to use,” the organisation adds.
9. Data sharing
The code outlines that taking data sharing into consideration is especially important when it comes to children, as sharing children’s personal data could put them at risk.
“The best interests of the child should be a primary consideration for you whenever you contemplate sharing children’s personal data,” it states.
10. Geolocation
The ICO stresses that the use of children’s geolocation data is of “particular concern”, as having access to the location of a child could pose a threat to their “physical safety”.
“In short it can make children vulnerable to risks such as abduction, physical and mental abuse, sexual abuse and trafficking,” the office writes.
11. Parental controls
The ICO explains that if an online company utilises parental controls, then the child should be made aware of the controls that are in place to regulate their online activity.
12. Profiling
The code says that profiling – which is “any form of automated processing of personal data consisting of the use of personal data to evaluate certain aspects relating to a natural person” – should only be permitted if a company has enforced “appropriate measures” to protect child users.
13. Nudge techniques
Nudge techniques, the ICO explains, are online cues which influence how a user may use a website, such as by encouraging them to click large, colourful buttons.
The organisation states that nudge techniques could be used to encourage children to “select less privacy-enhancing choices when personalising their privacy settings”, thus putting them and their personal data at greater risk.
14. Connected toys and devices
Some children’s toys and devices are designed to be able to connect to the internet, a feature that the ICO says raises “particular issues” due to “their scope for collecting and processing personal data”.
15. Online tools
Online tools are “mechanisms to help children exercise their rights simply and easily”, the code outlines.