Blaming social media algorithms for harming children, without equal emphasis on education, risks undermining the Online Safety Bill’s success, the professional body for IT has warned.

As the Online Safety Bill is introduced to Parliament after several major updates, the Digital Secretary Nadine Dorries said: “If we fail to act, we risk sacrificing the wellbeing and innocence of countless generations of children to the power of unchecked algorithms.”

BCS says singling out algorithms as responsible for children’s online safety is ‘passing the buck’. A combination of ethical design choices in those algorithms, supported by strong digital education and media literacy, is more likely to make the internet safer in the long term, BCS added.

Ethical choices

Dr Bill Mitchell OBE, Director of Policy at BCS, said: “It’s true that the public doesn’t trust any organisation - including tech companies and governments - to use algorithms to make decisions about them.

“But blaming the algorithm in this way is passing the buck. The problem lies in how we collectively make design choices. When we make badly informed choices, or ones influenced by our own ignorance or unconscious prejudice, that is how we end up with algorithms that cause harm.

“When algorithms are ethically designed and competently developed, they can genuinely help improve our daily lives and the chances of solving the big problems in the world, such as caring for an ageing population and climate change.”

The Online Safety Bill has been in progress for around five years. Ofcom is the new regulator for the sector, with the power to fine companies or block access to sites that fail to comply with the new rules. BCS welcomed the oversight role of Ofcom and the requirement for platforms to carry out risk assessments and produce plans to mitigate certain kinds of harm.

Better education

Professor Andy Phippen, a Fellow of BCS and a specialist in Ethics and Digital Rights at Bournemouth University, added: “The rhetoric around the bill seems to be about making children safe, but the entire drive of the proposed legislation is tech sector regulation.

“When I talk to young people about online safety, they usually say they need supportive adults who understand the issues and better education. None have ever demanded that big tech billionaires need to be brought to heel. I recently spoke to a group of young people who were clear that good online safety comes from good education and the opportunity to discuss and ask questions.”  

Whole society response

Dr Victoria Baines, a Fellow of BCS and a specialist in online safety investigations and policy, added: “By putting emphasis overwhelmingly on technical solutions and Big Tech responses, the Online Safety Bill as formulated appears to wilfully neglect the well-documented constant that some people will always seek to harm other people using available technology. Cybercrime and child exploitation are societal problems that require whole society responses. The government’s stated aspiration to ‘make the UK the world’s safest place to go online’ will not be realised without them.

“All the technical measures in the world are no substitute for adequate resources for specialist law enforcement, mental health support services for young people, and effective programmes for fostering critical thinking and digital good citizenship.”