How game developers can use privacy-by-design to conform with the Children’s code.

25 July 2022

This is a guest post produced for TIGA by the Information Commissioner’s Office (ICO). For more information on the ICO, please visit their website.

The Children’s code (known formally as the Age-appropriate design code) came into force on 2 September 2021. The Information Commissioner’s Office (ICO) has responsibility for enforcing this code. To support the games industry, the ICO has produced these best practice recommendations, which give game developers examples of practical ways they can meet the code’s standards. They draw on key findings to date from the ICO’s supervision of the code, including the improvements needed across the games sector.

  • Complete a data protection impact assessment (DPIA) to understand what personal data your game collects and how it processes that data

The ICO notes that the games sector needs to do further work on completing DPIAs, as a matter of urgent regulatory conformance. Standard two requires organisations to complete a DPIA to identify and minimise the data protection risks of their service. So far, around half of games companies either do not have a DPIA or are still in the process of completing one. Completed DPIAs often did not sufficiently address the risks posed to children, or how to mitigate those risks so that the game supports a child’s best interests. We suggest you make a data map to inform your DPIA.

For example, your data map may identify that the game keeps gameplay or operational data alongside personal data. Your DPIA may identify risks with this. For example, there is a risk of deleting the gameplay or operational data by mistake if a user requests you delete their data. To mitigate such risks, you may follow best practice and separate the types of data which you store. This means your developers do not lose gameplay data when removing personal data. It also means you can comply with the user’s request to delete their data.
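
As an illustration of this separation, here is a minimal TypeScript sketch, assuming a hypothetical store design in which a pseudonymous player ID is the only link between the two datasets (the names and structures are illustrative, not a prescribed implementation):

```typescript
// Illustrative only: personal data and gameplay data kept in separate
// stores, linked by a pseudonymous player ID.

interface PersonalRecord {
  playerId: string;       // pseudonymous key, not a real-world identifier
  displayName: string;
  email: string;
  dateOfBirth: string;
}

interface GameplayRecord {
  playerId: string;       // same pseudonymous key
  level: number;
  achievements: string[];
}

const personalStore = new Map<string, PersonalRecord>();
const gameplayStore = new Map<string, GameplayRecord>();

// An erasure request removes the personal record only. Gameplay and
// operational data keyed by the pseudonymous ID are untouched, so
// developers do not lose game state by mistake (provided no other
// identifying data sits in the gameplay store).
function handleErasureRequest(playerId: string): void {
  personalStore.delete(playerId);
}
```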

Your data map and DPIA could highlight where session user IDs can minimise the personal data you collect. This ensures you cannot identify users across sessions or datasets, provided there is no other permanent identifier. Similarly, when undertaking segment or cohort analysis, best practice is to analyse averaged rather than individual behaviour. This provides sufficient detail for analysis without needing to process data at an individual level.
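
A minimal sketch of both ideas, assuming a Node.js environment (the event fields are hypothetical):

```typescript
import { randomUUID } from "node:crypto";

// A fresh ID for each play session, so events cannot be linked across
// sessions, provided no other permanent identifier is stored with them.
function startSession(): string {
  return randomUUID();
}

interface SessionEvent {
  sessionId: string;
  minutesPlayed: number;
}

// Cohort analysis on averages rather than individual behaviour:
// only the aggregate figure leaves this function.
function averageMinutesPlayed(events: SessionEvent[]): number {
  if (events.length === 0) return 0;
  const total = events.reduce((sum, e) => sum + e.minutesPlayed, 0);
  return total / events.length;
}
```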

Games companies may wish to undertake a children’s rights impact assessment (CRIA) before completing a DPIA. The UK’s Children’s Commissioners recommend CRIAs, and they are a useful way of assessing how your service relates to all the code’s standards. They can also enable your game to support children’s best interests.

Together, these help you to minimise the processing of personal data (standard eight). The ICO worked with Fundamentally Games to develop a sample DPIA for games companies to use.

  • Assess the level of risk your game poses to children

The best interests of the child should be a primary consideration when you design and develop games likely to be accessed by a child. The ICO defines a child as anyone under 18. You need to assess the level of risk your game poses to children. This applies whether you designed your game specifically for children, or if you have evidence that children are likely to want to play it. When processing their personal data, consider the needs of child users and then support those needs in the game’s design. This includes the set-up process and ongoing management. For example, your game’s monetisation model needs to be transparent and consistent. This allows children or parents to make an informed purchase.

The ICO has best interests guidance and a best interests framework to help you. You should also consult with children and parents. This could take the form of:

  • child user surveys;
  • engagement with youth panels; or
  • participatory or co-designed research with children.

If this isn’t feasible, consult child advocates, schools or previous research on children’s views.

You should not use children’s personal data in ways that are detrimental to their wellbeing. For example, detrimental use could include using personal data to drive acquisition, monetisation, retention or nudges to extend gameplay. You should not allow personal data to be used in ways that could enable bullying or abusive behaviour. This could include:

  • unwanted contact online;
  • trolling;
  • pretending to be someone else; or
  • providing access to age-inappropriate content or products.

In particular, developers of multiplayer games need to consider the best interests of the child. For example, you should think about design features like playing in teams, team chat functions and accepting friend requests. If teams are set up randomly, children could be placed in teams that are not age-appropriate, which could lead to harm. Best practice is to ensure teams are age-appropriate, to give users controls over who can send them friend requests, and to provide parental controls with oversight of this.
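
One way to approach this, sketched in TypeScript with hypothetical types and defaults (not an ICO specification):

```typescript
// Illustrative sketch: age-banded team selection and a friend-request
// setting that is off until the user (or a parent) switches it on.

type AgeBand = "under13" | "13to15" | "16to17" | "adult";

interface Player {
  playerId: string;
  ageBand: AgeBand;
  acceptsFriendRequests: boolean; // off by default
}

// Only group players from the same age band into a team.
function buildTeam(pool: Player[], band: AgeBand, size: number): Player[] {
  return pool.filter(p => p.ageBand === band).slice(0, size);
}

// A friend request is only delivered if the recipient has opted in
// and both players are in the same age band.
function canSendFriendRequest(from: Player, to: Player): boolean {
  return to.acceptsFriendRequests && from.ageBand === to.ageBand;
}
```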

  • Use appropriate age ratings and stick to policies and community guidelines.

Publish your terms and conditions and policies and, importantly, adhere to them (standard six). Best practice is for these to cover age ratings, restrictions, behaviour and content policies. They should be available to users and non-users, and you should write them in a way that users can easily understand. You should also follow any PEGI rating or industry code of practice that you say you adhere to.

Implement mechanisms to uphold your published standards and policies and test that these are effective. Ensure that all users adhere to the standards.

Best practice includes allowing users to play your game for a limited time without collecting personal data or requiring them to accept any terms. This helps children understand what the game is and make a more informed choice about the processing of their personal data before they agree to terms of service. To support children’s best interests, the game available for a limited time should be appropriate for all children. Alternatively, it could only process the child’s age to let them try the game with the features available to their age group.

Terms of service must correspond to the actual users of the service. If your terms of service require users to be over a certain age, you need to prevent users under that age accessing your service. To do this, games companies need to undertake better analysis of their actual users, to understand if their terms of service are being upheld.

  • Provide an age-appropriate service or implement sufficient measures to assure age

You should not offer accounts to under-18s if your service is not appropriate for them to use. You can either establish age to a reasonable level of certainty or apply the standards in the code to all users (standard three). You need to take a risk-based approach to recognising the age of your users. Your DPIA helps you to do this. You may wish to take into account factors such as:

  • the types of data collected;
  • the volume of data;
  • the intrusiveness of any profiling;
  • whether decision-making or other actions follow from profiling; and
  • whether the data is being shared with third parties.

Our supervision of the games sector’s conformance with the code has highlighted a reliance on self-declaration for age assurance. However, this is often not sufficient for the risks the games pose. Self-declaration is easily circumvented, for example by children entering a fake age or re-entering an age until they gain access. If you aren’t applying all standards to all users, you must:

  • implement a method of age assurance that is sufficiently robust for the risks your game poses to children; or
  • provide a low-risk version of your game for all users, with additional features only available to users who can demonstrate they are old enough to access them, in line with your terms and conditions (see the sketch below).
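
As a rough illustration of the second option, a TypeScript sketch of feature gating driven by the level of age assurance (the feature names and assurance levels are hypothetical):

```typescript
// Illustrative sketch: a low-risk baseline for everyone, with
// higher-risk features unlocked only after sufficiently robust
// age assurance confirms the user meets the age in your terms.

type AgeAssurance = "none" | "selfDeclared" | "verified";

interface FeatureFlags {
  openChat: boolean;
  purchases: boolean;
}

function featuresFor(assurance: AgeAssurance, meetsMinimumAge: boolean): FeatureFlags {
  // Low-risk version served to all users by default.
  const flags: FeatureFlags = { openChat: false, purchases: false };

  // Self-declaration alone is treated as insufficient for these features.
  if (assurance === "verified" && meetsMinimumAge) {
    flags.openChat = true;
    flags.purchases = true;
  }
  return flags;
}
```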

You can find more information, including on assessing risk, in the ICO’s Opinion on the use of age assurance.

  • All settings must be ‘high privacy’ by default, unless you can demonstrate a reason not to.

You should disable social features that fall within the scope of the Children’s code until the user opts in. Social features can include chat functions or engagement with other users, such as commenting on, liking or reacting to their posts or progress. These features must also have appropriate moderation. We suggest having very limited social features for under-13s, unless you have a compelling reason. For example, we recommend avoiding free-text messaging unless it is strictly necessary for a beneficial purpose.

Profiling should be ‘off’ by default. You should only use profiling if you have measures in place to protect children from harmful effects or content that is detrimental to their health and wellbeing. If profiling is required, you must be able to demonstrate why it is necessary.
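
A minimal sketch of ‘high privacy’ defaults in TypeScript, assuming hypothetical setting names (the settings your game actually needs will differ):

```typescript
// Illustrative sketch: every new account starts with social features,
// profiling and data sharing switched off.

interface PrivacySettings {
  chatEnabled: boolean;
  profilingEnabled: boolean;
  dataSharingEnabled: boolean;
}

function defaultSettings(): PrivacySettings {
  return { chatEnabled: false, profilingEnabled: false, dataSharingEnabled: false };
}

// Any move away from the defaults is an explicit choice; for under-13s
// it also requires parental approval.
function enableChat(settings: PrivacySettings, isUnder13: boolean,
                    hasParentalApproval: boolean): PrivacySettings {
  if (isUnder13 && !hasParentalApproval) {
    return settings;
  }
  return { ...settings, chatEnabled: true };
}
```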

You should not use profiling for advertising unless the child opts to adjust the profile setting. We do not consider advertising to be an essential purpose. For children under 13, this requires parental approval. If advertisements are important for the game’s monetisation and sustainability strategy, consider using contextual advertising that does not rely on behavioural data profiling. This is particularly relevant for free-to-play games, subscription services, in-game purchases, loot boxes and competitions run through external organisations, such as social media companies. If using advertising providers, games companies should undertake due diligence to ensure the provider’s advertisements are age-appropriate. You must ensure that any advertisements in your game uphold your policies and community standards. You should refer to the Committee of Advertising Practice for further guidance on advertising in-game purchases.
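
To make the contextual approach concrete, a small TypeScript sketch of an ad request built only from the content being played, with no user identifiers (the field names are hypothetical, not any provider’s real API):

```typescript
// Illustrative sketch: a contextual ad request carries information about
// the game and placement, and deliberately no user ID, device ID,
// location or play-history fields.

interface ContextualAdRequest {
  gameGenre: string;   // e.g. "puzzle"
  ageRating: string;   // e.g. "PEGI 3"
  placement: string;   // e.g. "level-complete screen"
}

function buildAdRequest(gameGenre: string, ageRating: string,
                        placement: string): ContextualAdRequest {
  return { gameGenre, ageRating, placement };
}
```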

  • Don’t use nudge techniques to encourage children to provide unnecessary personal data or to lower or turn off their privacy protections.

Games companies need to disable any nudge techniques that encourage children to provide additional personal data or to lower or turn off their privacy protections. Otherwise, it is unlikely their games conform with standard 13 of the code. Nudge techniques include:

  • preventing users from saving their game until they reach a certain level;
  • encouraging users to sign up for other services, such as through competitions for extra in-game features via external organisations; or
  • nudging users to set up profiles which include personal data.

Best practice is to encourage children to leave the game regularly through natural breaks in gameplay. You should consider what constitutes reasonable timeframes for breaks between and within sessions and ways to explicitly provide break opportunities.
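
A simple sketch of a break prompt in TypeScript; the one-hour threshold is purely illustrative, not an ICO figure:

```typescript
// Illustrative sketch: offer a break at a natural point once a session
// has run for a set period of continuous play.

const BREAK_AFTER_MINUTES = 60; // hypothetical threshold

function shouldOfferBreak(sessionStart: Date, now: Date): boolean {
  const minutesPlayed = (now.getTime() - sessionStart.getTime()) / 60000;
  return minutesPlayed >= BREAK_AFTER_MINUTES;
}

// Call this at natural pauses (end of a level, after saving) rather
// than interrupting mid-action.
```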

A best practice recommendation for creating profiles is to limit them to first names or pre-generated usernames. You can also use a cartoon or animation rather than a personal picture or avatar.
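
For example, a short TypeScript sketch of pre-generated usernames and stock cartoon avatars (the word lists are invented for illustration):

```typescript
// Illustrative sketch: suggest a profile without asking the child for
// a real name or photo.

const ADJECTIVES = ["Brave", "Swift", "Clever", "Mighty"];
const ANIMALS = ["Otter", "Falcon", "Panda", "Fox"];
const AVATARS = ["cartoon-cat", "cartoon-robot", "cartoon-dragon"];

function randomItem<T>(items: T[]): T {
  return items[Math.floor(Math.random() * items.length)];
}

function suggestProfile(): { username: string; avatar: string } {
  const username = `${randomItem(ADJECTIVES)}${randomItem(ANIMALS)}${Math.floor(Math.random() * 100)}`;
  return { username, avatar: randomItem(AVATARS) };
}
```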

  • Clearly define and understand the role of each organisation involved in the creation, distribution and running of the game, and what data each should have.

This is crucial to conform with the data sharing standard. Your DPIA should help you to identify the various relationships.

Do not share children’s personal data unless it is necessary for a specific purpose, taking account of their best interests. Necessary sharing can include sharing with third parties that deliver services in your game, as long as those third parties are a core part of delivery and require the personal data to deliver their services.

Third-party platforms may collect personal data to provide their element of your service, such as payment. They shouldn’t share the data with you unless it is necessary for a specific purpose.

We have seen a number of games companies sharing personal data with social media companies, for example through account sign-ins. Data sharing must be off by default unless you can demonstrate a compelling reason for why you need it.

Best practice includes using responsible ad providers that use age-appropriate content and mechanisms. This means they:

  • do not profile to target ads to children;
  • do not have direct calls for action from their ads; and
  • clearly differentiate their content as an ad within your game to ensure transparency.

Many games companies actively promote and advertise the use of external forums, such as chat rooms. These are promoted as enhancing the user’s experience through interaction with others. You should consider whether it is in the child’s best interests to encourage the use of external forums, and how a third party would process a child’s data.

  • Support your team to handle personal data appropriately.

You should train designers and developers on how to handle children’s personal data. Job descriptions should clearly define responsibilities. Any training should include how to comply with data protection legislation and conform with the Children’s code. The ICO has developed award-winning design guidance which designers can use to apply some of the standards of the code in practice. Best practice is to use this guidance when designing new features or games.

  • Ensure your users know what personal data you’re collecting and how you’re using it.

Give children age-appropriate information, including about parental controls. You need to demonstrate that child users can understand the information you’re providing them. You should offer specific privacy information for each game. If using one privacy policy for multiple games, it should be clear and easy for children to understand which parts are relevant to the game they are playing. Undertake user testing to ensure your transparency information is clear and appropriate to the developmental age of the children using your game.

Ensure users understand that device sensors and areas of the game, for example the microphone, gyroscope and live chat functionality, can collect personal data. Best practice is to give children separate choices over which elements of your game they wish to activate. Providing additional explanations at points where you use their personal data is also best practice.

Be transparent about who you share children’s data with, and why – for example, third-party log-in services. Avoid entering data sharing arrangements where third parties then share personal data with their own partners. If you do this, children may not reasonably know who is processing their personal data. If using third-party log-ins, best practice is to provide a log-in option that does not require a social media account. This minimises data sharing and prevents users needing to create additional accounts.

Our right to be informed guidance provides further information on how you can provide privacy information.

  • Implement effective parental controls.

If your game allows parental monitoring, make it obvious to the child when this is happening. This is fundamental to meeting standard four (transparency) and standard 11 (parental controls).

We’ve seen games companies focus on parental controls for users under 13. However, this can mean that there is limited-to-no oversight of 13-17 year old users. Games developers should ensure there are parental controls that are appropriate for each age group. These can gradually reduce as children approach adulthood, rather than a step change at 13.

Best practice includes actively promoting parental control options at various stages of game use, such as during the sales process, in product information and in ongoing messaging. The ICO recommends giving parents the option of a report that lets them view things like game usage, friend requests and purchases.
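
A minimal sketch of such a report in TypeScript, with hypothetical fields (what belongs in a real report depends on your game and your DPIA):

```typescript
// Illustrative sketch: summarise usage, friend requests and spending
// for a parent without exposing chat content or other detail.

interface PlaySession { minutes: number; }
interface Purchase { item: string; price: number; }

interface ParentalReport {
  minutesPlayedThisWeek: number;
  friendRequestsReceived: number;
  totalSpendThisWeek: number;
}

function buildParentalReport(sessions: PlaySession[], friendRequests: number,
                             purchases: Purchase[]): ParentalReport {
  return {
    minutesPlayedThisWeek: sessions.reduce((sum, s) => sum + s.minutes, 0),
    friendRequestsReceived: friendRequests,
    totalSpendThisWeek: purchases.reduce((sum, p) => sum + p.price, 0),
  };
}
```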

  • Allow anyone to report concerns.

Best practice for standard 15 includes providing prominent and simple tools to help children understand and exercise their data protection rights and report concerns. It is also best practice to allow both users and non-users to report concerns. Ensure you can respond to user reports quickly, and moderate in line with the rules written in your community standards policies.

In summary

  1. Undertake a DPIA and take responsibility for the personal data collected by all the systems supporting your game.
  2. Understand that personal data means anything attributable in any way to a user, even indirectly. Train staff to handle personal data safely.
  3. Use parental controls that are appropriate to the child’s age and clearly tell children when these are active.
  4. Only collect personal data you actually need and do not nudge children to provide additional personal data. If using consent as your lawful basis for processing, ensure you have the appropriate consent, considering that children under 13 require parental consent.
  5. Use appropriate age rating systems and ensure you and your users abide by the rules set by your terms of service and community guidelines. Allow anyone to report concerns.

Design games with the best interests of the child in mind where under-18s are likely to access the service. Consider how your game conforms with all relevant standards of the Children’s code, and be transparent about how you use and share children’s personal data. This helps you conform with the code and increases your users’ confidence in your service.
