On Wednesday 12 February 2020, the UK Government announced that new powers would be given to the watchdog Ofcom to force social media firms to act over harmful content online.
Ofcom already regulates television and radio broadcasters, including the BBC, and deals with complaints about them. It will now be granted extended powers to make technology companies responsible for protecting adults and children from harmful content, including violence, terrorism, cyber-bullying and child abuse. Technology companies will also be expected to “minimise the risks” of harmful content appearing at all.
This is the Government’s initial response to the 2019 Online Harms consultation. The Government will set the direction of the policy, but will give Ofcom the freedom to devise and adapt the details so that it can tackle emerging threats without requiring new legislation.
Further details are expected to be announced in due course.
Dr Richard Wilson OBE, CEO of TIGA, said:
“TIGA supports the intention for an independent regulator to oversee a duty of care online, issue codes of practice and to enforce compliance.
“Social media is the leading source of online harm in the UK.[1] However, it is important that video game businesses that are within the scope of the regulatory framework fulfil their duty of care and minimise the potential for online harms.
“The UK’s video games development and digital publishing sector overwhelmingly consists of SMEs, with 66 per cent employing four or fewer people. It is critically important that the new regulatory framework protects online users whilst ensuring that the UK is the best place to start and grow a digital business.”