On 7 October 2020, as part of a Westminster Hall debate on online harms, Caroline Dinenage MP, the Minister of State for DCMS, made a statement on the upcoming online harms legislation and what it may consist of.
The key take-home points were as follows:
- The Government intends to establish a new duty of care for companies towards their users, requiring them to have appropriate systems and processes in place to deal with harmful content on their services and to keep their users safe.
- The Government’s approach will require companies to have clear and accessible mechanisms for users to report harmful content, to challenge it and, where necessary, to have it taken down.
- Companies will be expected to enforce their terms and conditions transparently and consistently. The duty of care will be overseen by a regulator, which will monitor these mechanisms and have strong enforcement powers to deal with non-compliance.
- The Government have also consulted on further enforcement powers, such as business disruption measures, internet service provider (ISP) blocking of non-compliant services and personal sanctions for senior managers. Further information on this will be published in the full Government response.
- The Government will expect companies to have a range of tools to protect children, including measures such as age-assurance and age-verification technologies. The Government are collaborating with the Home Office, GCHQ and a wide range of stakeholders on research into the verification of children online, considering the technical challenges of establishing which users online are children. They ran a successful technical trial to test the use of age-assurance technologies at scale.
Her full statement can be read below:
Online Harms
Westminster Hall
7 October 2020
The Minister for Digital and Culture (Caroline Dinenage): It is a pleasure to serve under your stewardship, Sir Edward. I thank the hon. Member for Halifax (Holly Lynch) for tabling this incredibly important topic for debate. This is my first opportunity since taking this role in February to speak publicly about online harms, and I am grateful for the chance to do so. I am also grateful to all Members who have taken part in the debate and raised some incredibly important topics. My hon. Friend the Member for Brigg and Goole (Andrew Percy) summed up an important challenge at the beginning: it should not take Government legislation to sort this out, but, unfortunately, it does, now more than ever. That was brought home to me over the summer, when I talked to the father of Molly Russell, a young lady whose story started with online bullying and then led on to her seeking information online as to how to take her own life, which she did. That was a conversation that I never want to have with another parent again. It was utterly chilling. That is why my dedication to making sure the legislation is fit for purpose is stronger than ever.
The hon. Member for Ogmore (Chris Elmore) challenged me to ensure that the legislation is robust, clear and soon, and I take that challenge. I have had a number of other challenges from across the room, and given that I have only a few moments to respond, I will get through as many as I can. Anyone I do not get to, I will write to. As hon. Members know, the Government published the online harms White Paper last year, setting out plans for legislation to make the UK the safest place in the world to be online. User safety is very much at the heart of our approach. We intend to establish a new duty of care for companies towards their users, which will ensure they have appropriate systems and processes in place to deal with the harmful content on their services and to keep their users safe. Our approach will require companies to have clear and accessible mechanisms for users to report harmful content and to challenge it—take it down, in fact—where necessary. Companies will be expected to enforce their terms and conditions transparently and consistently. The duty of care will be overseen by a regulator, which will have oversight of these mechanisms and strong enforcement powers to deal with non-compliance. The White Paper spoke about some of these powers, but we have also consulted on further powers to carry out things such as business disruption activities, blocking internet service providers and personal sanctions for senior managers. Further information will be published in the full Government response.
Following the public consultation, we published the interim Government response earlier in the year, which shared the findings from the consultation and indicated the direction of travel. We intend to publish the full Government response within the next few weeks and to have the legislation ready early next year. A range of other issues have been raised today, and I will get through as many as I can. The hon. Member for Upper Bann (Carla Lockhart) and many other hon. Members suggested that there might be some watering down of the legislation compared with the White Paper. In fact, the hon. Member for Bristol North West (Darren Jones) thought that it might be part of some of our trade negotiations. That is not the case. There will be no watering down—in fact, the opposite. The protection of children is at the heart of our approach to tackling online harms, and a number of hon. Members have raised that. There is huge recognition that the online world can be particularly damaging for children. We understand that. It is their mental health and their very well-being that are at stake. Our online harms proposals will assume a higher level of protection for children than for the typical adult user.
We will expect companies to have a range of tools to protect children, including measures such as age assurance and age verification technologies to prevent them from accessing inappropriate content. My hon. Friend the Member for Congleton (Fiona Bruce) spoke about the Digital Economy Act 2017. This will go further than the focus of the Digital Economy Act. One criticism of that Act was that its scope did not cover social media companies. One of the worst places where a considerable quantity of pornographic material is available to children is on social media. Our new approach will therefore include social media companies and all sites on which there is user-generated content, including major pornography sites. It is important that we no longer see age verification for pornography in isolation, but as part of this wider package to protect children across a range of sites and harmful materials. This technology is new and emerging, and it is important that we take every opportunity to get at the front end of it. That is why we are collaborating with the Home Office, GCHQ and a wide range of stakeholders on research into the verification of children online and considering the technical challenges of knowing who online is a child. We ran a successful technical trial to test the use of age-assurance technologies at scale. The initial findings have been promising, and I look forward to developing that work shortly. In recent years, there has been a massive rise in online abuse, harassment and intimidation, with a large majority of that on social media. I am clear that any abuse targeted towards anybody is unacceptable, but we heard from many Members that certain users are disproportionately targeted. For example, we know that issues such as revenge porn are rising. The UK Safer Internet Centre recently cited the fact that, this year, the revenge porn helpline has already dealt with 22% more cases than in the whole of 2019. That is not acceptable. We are clear that what is illegal offline should be illegal online, including a number of things raised today, such as incitement to violence and the selling of faulty and potentially hazardous goods.
We need to make sure that social media companies take as much responsibility as they can, but we also need to make sure that law enforcement agencies are equipped to take action where they need to. In some cases, the law is not fit for purpose to deal with the challenges of the online world, as we heard from my right hon. Friend the Member for Basingstoke (Mrs Miller). That is why we instructed the Law Commission to review existing legislation on abusive and harmful communications. It is also undertaking additional reviews, including on the taking, making and sharing of intimate images, which is obviously incredibly upsetting for victims. Given the nature of law making, a patchwork of offences has been developed over time to address this issue. The Law Commission is now considering the best way to address these harms and whether there are any gaps in legislation. We are working alongside it to consider the right legislative vehicle to take this issue forward. Finally, we have seen some horrific examples involving disinformation and misinformation over the covid period, including the burning down of 5G masts because of some horrific conspiracy theories.
We stood up the cross-Whitehall counter-disinformation cell earlier in the year and, to give reassurance to those who asked for it, we have been working since the beginning of the summer with colleagues across Government and with social media companies on how to respond to anti-vax campaigns, so that is very much in hand. As well as calling for action from companies, it is key that users are empowered with the knowledge and skills to keep themselves safe, which is why our online media literacy strategy will come out in partnership with the White Paper. With that, I will end, to leave time for the hon. Member for Halifax to conclude the debate.