Acceptable behaviour standards. Set out clearly the standard of behaviour required and the behaviour that will not be accepted.
Prevent illegal behaviour. Take additional measures to prevent the furtherance of illegal behaviour that has been recognised as harmful and prevalent in online communities. These harms include child sexual exploitation and abuse, terrorist activity, organised immigration crime, extreme and revenge pornography, harassment and cyberstalking, hate crimes, encouraging or assisting suicide, incitement of violence, sale of illegal goods/services, accessing content illegally uploaded from prisons and the distribution of indecent images by under 18s.
Prevent online harms. Also take measures to prevent the furtherance of other online harms, including cyberbullying and trolling, extremist content and activity, coercive behaviour, intimidation, disinformation, violent content, advocacy of self-harm and the promotion of female genital mutilation.
Clear complaints system. Provide clear, effective, easily accessible complaints and reporting procedures, along with tools that players can use to protect themselves online.
Player management system. Set up proportionate systems to manage players’ behaviour online, with appropriate procedures, technologies and investment, including in the staffing, training and support of human moderators.
Assist law enforcement. Comply with requests from law enforcement to assist with specific monitoring, for example where a specific threat to the safety of children has been identified, and support investigations to bring to justice criminals who break the law in online games. (For the avoidance of doubt, games businesses are not expected to undertake general monitoring of all communications on their online services, as this would place a disproportionate burden on companies and would raise privacy concerns.)
Effective user reporting. Take prompt, transparent and effective action following user reports, including by imposing, within an appropriate timeframe, proportionate sanctions on players who breach behaviour policies.
Safety technology. Purchase or develop safety technologies to reduce the burden on users to stay safe online and assist with identifying, flagging, blocking or removing illegal or harmful content.
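As an illustration only, the simplest form of such flagging technology is a deny-list filter that queues suspect messages for moderator review. The term list and function name below are hypothetical, and a production system would rely on trained classifiers, hash-matching against known illegal imagery and human review rather than keywords alone:

```python
import re

# Hypothetical deny-list for illustration; real systems combine classifiers,
# hash-matching and human moderation rather than a static keyword set.
FLAGGED_TERMS = {"exampleslur1", "exampleslur2"}

def flag_message(text: str) -> bool:
    """Return True if the message should be queued for moderator review."""
    words = set(re.findall(r"[a-z0-9']+", text.lower()))
    return bool(words & FLAGGED_TERMS)
```

Routing flagged messages to human moderators, rather than blocking them outright, keeps false positives from silencing legitimate conversation.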
Records of harmful content. Keep appropriate records of reports of illegal and harmful content and behaviour, including the number of reports received, how many of those reports led to action and what the action taken was.
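The record-keeping figures named above (reports received, reports actioned, actions taken) could be aggregated along the following lines. This is a minimal sketch with illustrative field names, not a mandated schema:

```python
from collections import Counter
from dataclasses import dataclass
from typing import Optional

@dataclass
class HarmReport:
    """One user report of illegal or harmful content or behaviour (illustrative schema)."""
    report_id: str
    category: str                 # e.g. "harassment", "hate", "self-harm"
    action_taken: Optional[str]   # e.g. "content_removed", "player_banned"; None if no action

def summarise(reports: list) -> dict:
    """Produce the figures the record-keeping principle names: reports received,
    reports that led to action, and a breakdown of the actions taken."""
    actioned = [r for r in reports if r.action_taken]
    return {
        "reports_received": len(reports),
        "reports_actioned": len(actioned),
        "actions_by_type": dict(Counter(r.action_taken for r in actioned)),
    }
```

Keeping the raw per-report records, not just the totals, lets the figures be re-audited later if categories or policies change.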
Support for users. Provide information to users who have suffered online harm about appropriate sources of support.
Review efforts to tackle online harms. Regularly review efforts in tackling online harms and adapt online processes to drive continuous improvement.
User protection from harm by design. Include game design features that minimise harassment, ‘griefing’, ‘trolling’ and other undesirable online behaviours, for example by enabling players to mute, render invisible and avoid being impeded by the avatars of those who are harassing them (while respecting the limitations of a player-vs-player environment).
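The per-player muting described above could be sketched as follows. The class and method names are hypothetical; a real implementation would also cover avatar rendering, voice chat and collision handling:

```python
class ChatModeration:
    """Illustrative per-viewer block list: messages from muted players are
    hidden, and a game client could likewise skip rendering their avatars."""

    def __init__(self) -> None:
        # Maps each viewer's player id to the set of player ids they have muted.
        self._muted: dict = {}

    def mute(self, viewer: str, target: str) -> None:
        self._muted.setdefault(viewer, set()).add(target)

    def unmute(self, viewer: str, target: str) -> None:
        self._muted.get(viewer, set()).discard(target)

    def visible_messages(self, viewer: str, messages: list) -> list:
        """Filter (sender, text) pairs, dropping anything from muted senders."""
        muted = self._muted.get(viewer, set())
        return [text for sender, text in messages if sender not in muted]
```

Making the block list per-viewer means one player's mute never affects what other players see, which keeps the feature usable in a player-vs-player environment.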