BCS Senior Policy Manager Dan Aldridge has prepared a briefing on the Online Safety Bill, which is in its final stages at Westminster. He said:

“With so much occupying parliamentary attention over the past few years, it’s understandable that the Online Safety Bill has taken so long to get to its final reading. Also, the hugely emotive and existential questions it poses can in part explain why the Bill has seen so many iterations over the years, with significant amendments announced just recently (detailed below).

“Despite this lengthy process and some significant mainstream media crossover, this Bill is one of the least understood but potentially most consequential pieces of government legislation, possibly in a generation. For the Online Safety Act to deliver on its intended purposes, its implementation and evaluation must be collaborative.

“The Bill’s Parliamentary stewards often invoke children and young people as being at the heart of this issue, but they have been noticeably absent from any meaningful consultation. How the Act facilitates better digital education, inclusion and access by communities will be a key indicator of its success and impact. We must all be part of making that change, and we look forward to supporting the government in enacting the Online Safety Act.”

Demonising algorithms won’t protect children from social media harm, BCS warns in response to the Online Safety Bill

The overall aim of the Bill is to require online platforms – where users interact with one another – to comply with a duty of care to their users and to remove content that is illegal or considered harmful under the new rules, with large fines and the prospect of sites being blocked among the potential penalties.

In March 2022, the Bill returned to Parliament, following a number of changes to the draft legislation.

What’s new since the last BCS policy briefing on the Online Safety Bill?

Communications offences

On 4 February, the DCMS announced that it was accepting the Law Commission's recommendations for a harm-based communications offence, a false communications offence, and a threatening communications offence. These would be brought into law through the Bill.

Priority offences

On 7 February, the DCMS announced that it would be setting out further priority offences on the face of the Bill (offences relating to terrorism and child sexual abuse and exploitation are already listed).

This was in response to recommendations from the Joint Committee on the draft Bill, the DCMS Committee and the Petitions Committee.

The offences would include incitement to and threats of violence, hate crime, and financial crime. Listing these offences in the Bill would mean that companies would not have to wait for secondary legislation before taking proactive steps to tackle priority illegal content.

Protecting children from pornography

On 8 February, the DCMS announced that the Bill would be strengthened so that all providers who published or placed pornographic content on their services would need to prevent children from accessing that content. This was in response to concerns that non-user-generated pornography was not within the scope of the draft Bill.

Online abuse

On 25 February, the DCMS announced that, to tackle online abuse, including anonymous abuse, the Bill would impose two additional duties on category 1 service providers (i.e. the largest platforms):

  • a “user verification duty” would require category 1 providers to give adult users an option to verify their identity. Ofcom would publish guidance setting out how companies could fulfil the duty and the verification options that companies could use
  • a “user empowerment tools duty” would require category 1 providers to give adults tools to control who they interacted with and the legal content they could see

The new duties were in response to concerns raised by the Joint Committee on the draft Bill, the DCMS Committee, and the Petitions Committee about the impact of abuse and the need to give users more control over who they interacted with.

Paid-for fraudulent adverts

On 9 March, the DCMS announced that category 1 service providers and search services would have a duty to prevent the publication of paid-for fraudulent adverts. This was in response to recommendations from the Joint Committee on the draft Bill, the DCMS Committee and others.

The Government also announced a separate consultation on the Online Advertising Programme. This would complement the Bill and seek views on improving transparency and accountability across the online advertising supply chain.

Cyberflashing

On 14 March, the DCMS announced that the Bill would create a new criminal offence relating to “cyberflashing”. The offence would be constructed as recommended by the Law Commission.

A new section 66A would be inserted into the Sexual Offences Act 2003 to criminalise intentionally sending or giving a photograph or film of any person’s genitals to another person, either with the intention that the recipient will see the genitals and be caused alarm, distress or humiliation, or for the purpose of obtaining sexual gratification while being reckless as to whether the recipient will be caused alarm, distress or humiliation.

With thanks for contributions from online harms experts Professor Andy Phippen FBCS, Dr Victoria Baines FBCS, Maeve Walsh, Associate at Carnegie UK Trust, and Ellen Judson, Senior Researcher at CASM, the digital research hub at the cross-party think-tank Demos.