Plans to compel social media platforms to tackle online harms are not fit for purpose, according to a new poll of IT experts.

The Online Safety Bill would put a duty on tech giants like Facebook and Google to develop systems to identify and remove illegal material, as well as to deal with content that is harmful to adults and children. Ofcom, as the regulator, would enforce the rules.

Only 14% of tech professionals believed the legislation was ‘fit for purpose’, according to the survey by BCS, The Chartered Institute for IT.

Some 46% said the bill was not workable, with the rest unsure.

Freedom of speech

Most IT specialists (58%) told BCS the legislation would have a negative effect on freedom of speech.

Only 19% felt the measures proposed would make the internet safer, with 51% saying the law would not make it safer to be online.

There were nearly 1,300 responses from tech professionals to the survey by BCS.

Legal but harmful content

The government has stressed that the bill does not require the removal of legal content, but larger platforms must set out how ‘priority content that is harmful to adults’ – such as suicide-related material – is dealt with by their service.

This could include specifying that the content would be removed, deprioritising it in news feeds, or simply making clear what content is freely accessible. Types of harmful content will be specified by parliament; platforms must also take steps not to limit freedom of speech.

Just 9% of IT specialists polled said they were confident that ‘legal but harmful content’ could be effectively and proportionately removed.

Some 74% of tech specialists said they felt the bill would do nothing to stop the spread of disinformation and fake news.

Just one quarter (25%) felt it was realistic for Ofcom to ask platforms to ‘develop or source’ accurate technology to detect sexual abuse material; 57% said it was not.

The Online Safety Bill is now due to return to parliament after the new Prime Minister is chosen on 5 September.

Legislation needs ‘fundamental review’

Rob Deri, Chief Executive of BCS, The Chartered Institute for IT, said: “There is a real need to prevent online harm, but this law only goes part way to achieving that. The aim should be to prevent hatred and abusive online behaviours by stopping harmful material appearing online in the first place – and that takes a mix of both technical and societal changes.

“A new Prime Minister should take the opportunity to fundamentally review the Online Safety Bill in its current form.

“The technology itself has an important part to play in keeping people safe on social media platforms. However, the Bill leans too heavily on tech solutions to prevent undesirable content, which can’t be relied upon to do that well enough, and could affect freedom of speech and privacy in ways that are unacceptable in a democratic society.

“The legislation should also focus on substantive programmes of digital education and advice, so that young people and their parents can confidently navigate the risks of social media throughout their lives.”

About the survey:
This report was generated on 02/08/2022. Overall, 1,296 members of BCS, The Chartered Institute for IT completed the poll between 21/07/2022 and 02/08/2022.