ICO Age Appropriate Design Code: TIGA’s Summary

9 August 2019 | Industry News

On 12 April 2019, the Information Commissioner’s Office (ICO) published its draft ‘Age appropriate design: a code of practice for online services’. The Code sets out 16 design standards that the Commissioner will expect providers of online ‘Information Society Services’ (ISS) to meet. The Code applies if your service processes personal data and is likely to be accessed by children. For the purposes of the Code, a child is defined as a person under 18 years of age.

The scope of the Code includes apps, programs, websites, games or community environments, and connected toys or devices with or without a screen. The draft Code contains practical guidance on 16 standards of age-appropriate design.

Providers who fail to act in accordance with a provision of this Code may invite regulatory action and will find it difficult to demonstrate compliance with the law. In particular, the Commissioner will take the Code into account when considering the use of her enforcement powers. The Code can also be used in evidence in court proceedings, and the courts must take its provisions into account wherever relevant.

Tools at ICO’s disposal include assessment notices, warnings, reprimands, enforcement notices and penalty notices (administrative fines). For serious breaches of the data protection principles, the ICO has the power to issue fines of up to €20 million or 4% of a provider’s annual worldwide turnover, whichever is higher.

Summary of the Code’s Standards

  1. Best interests of the child: The best interests of the child should be a primary consideration when you design and develop online services likely to be accessed by a child.

Amongst other things, businesses will need to consider how to:

  • keep children safe from exploitation, including commercial and sexual exploitation;
  • protect and support their health and wellbeing;
  • protect and support their physical, psychological and emotional development.
  2. Age-appropriate application: Consider the age range of your audience and the needs of children of different ages. Apply the standards in this code to all users, unless you have robust age-verification mechanisms to distinguish adults from children.


  3. Transparency: The privacy information you provide to users, and other published terms, policies and community standards, must be concise, prominent and in clear language suited to the age of the child. Provide additional specific ‘bite-sized’ explanations about how you use personal data at the point that use is activated.


  4. Detrimental use of data: Do not use children’s personal data in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice.

ISS providers should not use children’s personal data in ways that have been shown to be detrimental to their wellbeing, e.g. strategies that extend user engagement, such as reward loops, continuous scrolling, features that encourage continuous play, and notifications. Businesses should introduce mechanisms such as pause buttons which allow children to take breaks without losing progress in a game.


  5. Policies and community standards: Uphold your own published terms, policies and community standards (including but not limited to privacy policies, age restriction, behaviour rules and content policies).

If ISS providers have published rules governing the behaviour of users of their service, they need to uphold those rules and put in place the systems they have said they will. For example, if you say that you will not tolerate bullying then you need to have adequate mechanisms in place to swiftly and effectively deal with bullying incidents. Relying on self-reporting of incidents may not be enough. You may need to actively employ monitors and technologies to protect against harms.


  6. Default settings: Settings must be ‘high privacy’ by default (unless you can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child).


  7. Data minimisation: Collect and retain only the minimum amount of personal data you need to provide the elements of your service in which a child is actively and knowingly engaged. Give children separate choices over which elements they wish to activate.


  8. Data sharing: Do not disclose children’s data unless you can demonstrate a compelling reason to do so, taking account of the best interests of the child.


  9. Geolocation: Switch geolocation options off by default (unless you can demonstrate a compelling reason for geolocation, taking account of the best interests of the child), and provide an obvious sign for children when location tracking is active. Options which make a child’s location visible to others must default back to off at the end of each session.


  10. Parental controls: If you provide parental controls, give the child age-appropriate information about this. If your online service allows a parent or carer to monitor their child’s online activity or track their location, provide an obvious sign to the child when they are being monitored.


  11. Profiling: Switch options which use profiling off by default (unless you can demonstrate a compelling reason for profiling, taking account of the best interests of the child). Only allow profiling if you have appropriate measures in place to protect the child from any harmful effects (in particular, being fed content that is detrimental to their health or wellbeing).

Content or behaviours that may be detrimental to children’s health and wellbeing include user-generated content and strategies used to extend user engagement, such as timed notifications that respond to inactivity. If a provider cannot put measures in place to protect against any possible harmful effects, then it should not profile children.


  12. Nudge techniques: Do not use nudge techniques to lead or encourage children to provide unnecessary personal data, weaken or turn off their privacy protections, or extend their use.

Providers should not use nudge techniques that lead children to make poor privacy decisions, or use reward loops or similar techniques that exploit human susceptibility to reward- or pleasure-seeking behaviours in order to keep children engaged in their game.

However, providers should use nudge techniques towards wellbeing enhancing behaviours (such as taking breaks). Providers should also provide tools to support wellbeing enhancing behaviours (such as mid-level pause and save features).


  13. Connected toys and devices: If you provide a connected toy or device, ensure you include effective tools to enable compliance with this code.


  14. Online tools: Provide prominent and accessible tools to help children exercise their data protection rights and report concerns.


  15. Data protection impact assessments: Undertake a DPIA specifically to assess and mitigate risks to children who are likely to access your service, taking into account differing ages, capacities and development needs. Ensure that your DPIA builds in compliance with this code.


  16. Governance and accountability: Ensure you have policies and procedures in place which demonstrate how you comply with data protection obligations, including data protection training for all staff involved in the design and development of online services likely to be accessed by children. Ensure that your policies, procedures and terms of service demonstrate compliance with the provisions of this code.


What happens next?

As it stands, the Code is in draft form and subject to change. Earlier this year the ICO held a consultation on the draft Code, providing an opportunity for stakeholders to offer feedback. TIGA has engaged with the ICO and made representations to ICO officials in a meeting in July 2019. The final version of the Code will be delivered to Parliament ahead of the ICO’s statutory deadline of 23 November 2019. It will then be subject to Parliamentary approval before coming into force.

In response to stakeholder feedback, Commissioner Elizabeth Denham announced:

  • The Code will set out rules on how data can be used and the importance of protecting children, helping designers and developers understand what is expected of them.
  • The ICO understands that delivering the standards set out in the Code will bring challenges for the tech, e-gaming and interactive entertainment industries, causing a shift in the design processes for online services which make the greatest use of children’s data.
  • Following stakeholder feedback, the ICO is aware that expectations under some standards need to be clearer.
  • The ICO does not want to see an age-gated internet; instead, it wants providers to set their privacy settings to ‘high’ as a default, and to have strategies in place for how children’s data is handled.
  • The law allows for a transition period of up to one year, and the ICO will be considering the most appropriate approach for implementing this transition period.
  • The ICO is preparing a significant package to ensure that organisations are supported through any transition period, including help and advice for designers and engineers.

More information can be found in the Information Commissioner’s response to stakeholder feedback.
