Timeline
- The Government published an Online Harms White Paper setting out their plans for a package of online safety measures and announcing a consultation with stakeholders on its contents. The consultation began on 8 April 2019 and closed on 1 July 2019.
- The Government published their initial response to the Online Harms White Paper consultation on 12 February 2020. In it, they indicated that Ofcom would be given the role of regulating online harms.
- The Government announced a call for evidence on the impact of loot boxes in video games, to examine concerns that they may encourage or lead to problem gambling. The call for evidence closed on 22 November 2020.
- On 15 December 2020, the Government published their full response to the Online Harms White Paper.
- The Online Safety Bill was included in the Queen’s Speech on 11 May 2021. The draft Bill was published the following day, alongside Explanatory Notes, an Impact Assessment and a Delegated Powers Memorandum.
- In July 2021, the Joint Committee on the draft Online Safety Bill was established to scrutinise it.
- The Committee’s report was published on 14 December 2021.
- On 4 February 2022, the Government announced that the Online Safety Bill would be strengthened with an updated list of criminal content for tech firms to remove as a priority.
- On 25 February 2022, the Government announced that social media platforms would be legally required to give users the power to block unverified users and to opt out of seeing harmful content.
- On 9 March 2022, the DCMS announced that category 1 service providers and search services would have a duty to prevent the publication of paid-for fraudulent adverts. The Government also announced a Consultation on the Online Advertising Programme.
- On 14 March 2022, the DCMS announced that the Bill would create a new criminal offence relating to cyberflashing.
- On 17 March 2022, the Government introduced the Online Safety Bill to Parliament.
- On 30 June 2023, the Government announced key changes to the Online Safety Bill, bolstering protections for children, empowering adults to control the content they see, facilitating access to social media data for coroners and bereaved parents, and requiring research on harms arising from app stores.
The Government's full response to the Online Harms White Paper
- On 15 December 2020, the Government published their full response to the Online Harms White Paper.
- TIGA have published a summary of the key aspects of the response relevant to the video games industry.
- The Government have published a fact sheet, answering questions surrounding the Government Response.
- Oliver Dowden’s Ministerial statement to the House of Commons, announcing the publication of the Government’s response, can be accessed here.
- The Chair of the DCMS Select Committee, Julian Knight MP, has commented on the contents of the Government’s response. His comments can be accessed here.
- Ofcom have released a statement outlining their role in the new Online Harms framework, which can be read in full here.
New report on Online Harms published by MPs and Peers
- A new report has been published by a group of MPs and Peers.
- The Joint Committee on the draft Online Safety Bill, chaired by Damian Collins MP, has recommended major changes to the Online Safety Bill.
- The main conclusions from the report can be found here.
- The Committee believes the Bill should be clearer about what is specifically illegal online and it should not be up to the tech companies to determine this.
- The Committee agrees with the Law Commission’s recommendations about adding new criminal offences to the Bill. It recommends that cyberflashing, and deliberately sending flashing images to people with photosensitive epilepsy with the intention of inducing a seizure, be made illegal.
- Pornography sites should have legal duties to keep children off them, regardless of whether they host user-to-user content.
- Content or activity promoting self-harm should be made illegal, as it already is for suicide.
- Further to this, the report recommends that individual users should be able to make complaints to an ombudsman when platforms fail to comply with the new law.
- They also recommended that a senior manager at board level, or reporting to the board, should be designated the “Safety Controller” and made liable for failures to comply with obligations as a regulated service provider where there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users.
New criminal offences added to the Online Safety Bill
- On 4 February 2022, the Government announced that the Online Safety Bill would be strengthened with an updated list of criminal content for tech firms to remove as a priority.
- A list of new criminal offences will be added to the Bill, including online drug and weapons dealing, people smuggling, revenge porn, fraud, promoting suicide and inciting or controlling prostitution for gain.
- Previously, firms needed to take such content down only once it had been reported to them by users. Now they will need to be more proactive, ensuring the functionalities and features of their services are designed to prevent people from being exposed to such content in the first place.
- Ofcom will be able to take faster enforcement action against firms that fail to remove such content. It will be able to issue fines of up to 10 per cent of annual worldwide turnover to non-compliant sites and block them from being accessible in the UK.
New measures announced
- On Friday 25 February 2022, the Government announced two additional duties on category 1 companies.
- Social media platforms will now be legally required to give users the power to block unverified users and to opt out of seeing harmful content.
- The largest social media companies, such as Twitter and Facebook, will be legally required to offer ways for users to verify their own identities and control who can interact with them, including the ability to block anonymous accounts or anyone who has not verified their identity.
- Similarly, such platforms will be obliged to make tools available for their users which will allow them to choose whether or not they see certain content that is not illegal but still causes significant harm, such as racist abuse, the promotion of self-harm, eating disorders and anti-vaccine disinformation.
Online Safety Bill presented to Parliament
- On Thursday 17 March 2022, the Bill was presented to Parliament by the Government.
- The Bill was given its First Reading on Thursday 17 March 2022. This stage is formal and takes place without any debate.
- MPs will next consider the Bill at Second Reading on Tuesday 19 April 2022.
New amendments announced
- On 30 June 2023, the Government announced key changes to the Online Safety Bill, bolstering protections for children, empowering adults to control the content they see, facilitating access to social media data for coroners and bereaved parents, and requiring research on harms arising from app stores.
- Strengthened protection for children: The amendments will provide better protection for children by preventing their access to content promoting suicide, self-harm or eating disorders. Top tech executives will also be held personally responsible for keeping children safe on their platforms.
- Easier access to social media data: The changes will make it easier for coroners and bereaved parents to access data from social media platforms, allowing them to understand whether online activity contributed to a death. The announcement highlights collaboration with the Bereaved Parents for Online Safety group and Baroness Kidron.
- Research on app store harms: A new measure will require Ofcom to conduct research into the harms arising from app stores, aimed at addressing the risks of children accessing harmful content through them.
- Empowering adults to control content: The amendments will empower adults to take control of the content they see on the largest platforms, which will be required to proactively offer adult users tools to avoid content relating to self-harm, eating disorders and abuse based on race or religion.
- Guidance for reducing harm to women and girls: Ofcom will be required to publish guidance summarising measures that services can take to reduce the risk of harm to women and girls, produced with input from the Domestic Abuse Commissioner and the Victims’ Commissioner.