Tech issues have never been more central to government policy and decision-making, with choices taken now liable to affect us for decades. Theo Knott, Policy and Parliamentary Affairs Officer at BCS, reports on two areas, internet safety and AI, where BCS has been influencing government decisions recently.

We live in a time of sweeping change. Relative to wider human history, recent decades have been tranquil; consensuses have held, institutions appeared solid and political scientists wrote papers questioning whether we had reached ‘the end of history’.

Few people would disagree that this period of tranquillity is reaching its conclusion and, if anything, change often appears to be outstripping our capacity to deal with it.

One of the great engines of this change is undoubtedly technology. The continued development of AI, big data, internet platforms and more will fundamentally change how we all live our lives; it is already doing so. It is fair to say that the political establishment has often been reactive on issues of technology when it desperately needs to be proactive.

Simply put, we are at a stage when politics must do better. It is not hyperbole to say that many decisions taken by policymakers regarding technology in the coming years will alter the human experience for the foreseeable future. As a result, the role of BCS as an independent voice in tech, with a membership of IT professionals and experts, is more important than ever.

Examining BCS’s place

This has been reflected by the fact that BCS has been at the heart of two major government announcements on technology in recent weeks: one on internet safety and one on AI. The first of these is the Internet Safety Strategy, which the government is developing as the linchpin of its Digital Charter: a series of policies intended to exploit the opportunities of new technologies and mitigate their negatives.

Keeping young people safe online is a challenge akin to painting the Forth Road Bridge; just as you think you have things resolved, the online environment shifts and needs resolving again. Few inside the tech profession would argue that the internet has been anything but a force for human good, but this ought not to detract from the fact that we have an obligation to keep the vulnerable as well protected as we can.

Toward internet safety

The Internet Safety Strategy is designed to address this issue by putting stronger measures for online safety into place. BCS has collaborated with the Department for Digital, Culture, Media and Sport (DCMS), the department responsible for the strategy, to build an evidence base that will help ensure these new measures are fit for the purpose of protecting young people from threats online.

Following meetings with DCMS last year, BCS decided to conduct a poll of school-age children on their views of social media platforms and companies. The survey was sent to teachers in the Computing at Schools network, which includes those teaching at over 1,700 primary and secondary schools in England, inviting their pupils to complete it.

The survey elicited a brilliant response, with over 6,500 young people aged between seven and 18 providing their views, making it possibly the most comprehensive survey of its kind. These results were then shared with DCMS, along with analysis and recommendations on what the next steps should be.

The government’s response

Recently, the government released its official response to the Internet Safety Strategy consultation, and BCS is referenced more heavily than any other non-governmental organisation. More than this, our survey and wider work in this area has clearly helped to inform and alter government thinking.

For example, our results found a strong consensus among young people that social media platforms should automatically remove abusive and offensive posts. The government has consequently committed to providing greater guidance on privacy controls through its forthcoming social media code of practice, giving users more ability to block people who might send them this type of negative content.

Potentially the most significant change from consultation to response has been the government's willingness to look deeper at the role that education must play in keeping children safe online.

Our survey discovered that 72 per cent of schoolchildren would like more online safety education, and a key conclusion from our wider consultation responses was that online resilience through education must not be forgotten in a world where it is impossible to snuff out online threats entirely.

The government’s positive thinking in this area has been encouraged and influenced by BCS utilising its networks and feeding its members’ views into policymaking.

Artificial intelligence

If action on internet safety and the scope of internet regulation is very much a challenge of today, AI and automation are rightly seen as among the preeminent challenges of the near future.

The scope of change is potentially as wide as the mind can envisage, and decisions around AI and security, ethics and job displacement (to name just three areas) are among the biggest questions politicians of all stripes will have to grapple with in the coming years.

Consequently, as part of the government’s Industrial Strategy, it was announced in April that AI was to be granted one of the first ‘sector deals’: comprehensive partnerships between government and industry in fledgling sectors that could boost productivity, employment and innovation.

One of the points BCS has made regarding AI is the need to equip people with the skills to harness it, and this was reflected during London Tech Week as details of the sector deal were fleshed out.

BCS’s place in the debate

Greg Clark, the Business Secretary, put forward a range of new policies centred on AI, with BCS playing a key part in their delivery. These included around £300 million of new funding for AI research, an expansion of government-funded AI PhDs and further details of the forthcoming Centre for Data Ethics and Innovation.

Undoubtedly, one of the central announcements was the launch of a new AI Masters programme, which will see BCS working with universities and organisations like the Turing Institute and Amazon to create a new qualification appropriate for a technology set to fundamentally change how we live our lives.

While AI and internet safety are just two of the areas that BCS has worked with the government on this year, they illustrate the variety of topics we are involved in and the tangible change we can deliver. It’s worth reiterating the unique role BCS plays as an organisation without political affiliation or agenda and a membership of 70,000+ IT professionals and experts.

At a time of sweeping change, these qualities are perhaps of more relevance than ever before. Informing politicians and improving policy go hand-in-hand with BCS’s values of making IT good for society. So, while the challenges in technology ahead are daunting, it’s an invigorating time too; a time when BCS members are really changing policy for the better.