There needs to be an open and honest dialogue between lawmakers and those who are tasked with implementing legislative demands. F-TAG explores the debate.

The recent announcement of Nadine Dorries’s appointment as Secretary of State for Digital, Culture, Media and Sport raised concerns in the press, stemming from historical tweets that seemed to suggest a lax attitude toward data protection practices.

The tweets in question were made in 2017, during media coverage of whether the MP Damian Green had historically accessed pornography on his laptop.

The story had its origins in 2008, when Mr Green’s laptop was examined as part of an investigation into Home Office leaks. The media reporting, prompted by comments from a former police officer who had examined the laptop, somewhat inevitably spilled on to social media, with Mr Green defending his integrity in the face of mockery.

During this social media storm, Ms Dorries tweeted:

“My staff log onto my computer on my desk with my login everyday. Including interns on exchange programmes. For the officer on @BBCNews just now to claim that the computer on Greens desk was accessed and therefore it was Green is utterly preposterous !!”

While some naively suggested this might present a challenge to national security, Ms Dorries further tweeted that there was no national security risk because she was but a backbench MP with no access to official secrets:

“Flattered by number of people on here who think I’m part of the Government and have access to government docs. I’m a back bench MP - 2 Westminster based computers in a shared office. On my computer, there is a shared email account. That’s it. Nothing else. Sorry to disappoint!”

Other MPs defended both Mr Green and Ms Dorries, suggesting they were equally guilty of such lax access control on their parliamentary computers. In since-retracted tweets, Nick Boles MP and Will Quince MP stated, respectively:

“I certainly do. In fact I often forget my password and have to ask my staff what it is.”

“Less login sharing and more that I leave my machine unlocked so they can use it if needs be. My office manager does know my login though. Ultimately I trust my team.”

At face value, this shows a lax attitude toward data security and protection. Even if they are simply backbench MPs, as a result of the role they play they will undoubtedly be sent private communications from constituents. These will almost certainly contain sensitive personal information pertaining to their circumstances and will, therefore, be worthy of protection under the Data Protection Act.

Taking data protection seriously

It would be easy to see this admission of lax data protection and cybersecurity practices as another opportunity for mockery. However, away from the froth of social media factionalism there is a more serious point to make.

What happens when our lawmakers fail to understand digital technology, and how to use it responsibly, when they are developing legislation to control it?

At the time of writing, the Telecommunications (Security) Bill is moving into its final stages of examination before royal assent, the Online Safety Bill is undergoing pre-legislative scrutiny and there is much talk of a move away from the GDPR in the country’s post-Brexit data protection legislation. All of these pieces of legislation have the potential to place a great deal more regulation on the IT sector.

This is not without merit: clearly there are facets of the IT sector that face little ethical challenge, or feel no responsibility for how citizens use their services. However, this legislation needs to be grounded in an understanding of the nature of digital technology, so that it does not place expectations on the sector that cannot easily be achieved through code.

Social media and politics

It might be easy for a politician to say that they want a social media provider to stamp out racism on its platform. However, the reality of using code to solve a social problem that manifests online is far more challenging. Technology is very good at some tasks, such as matching racist keywords, automating the removal of posts that contain them and suspending the accounts that post them.

However, stamping out racism requires a far more intelligent interpretation of posts that are free of such keywords. It also calls for far more complex algorithms that avoid false positives, in which the platform accuses a user of racism when a post contained no such thing.
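To see why, consider a minimal sketch in Python of naive keyword moderation. The blocklist and function names are illustrative inventions (using a harmless stand-in word rather than real slurs) and do not reflect any platform’s actual systems:

import re

# Illustrative blocklist with a harmless stand-in word; a real system
# would hold lists of slurs and offensive phrases.
BLOCKED_TERMS = {"ass"}

def naive_flag(post: str) -> bool:
    """Flag a post if any blocked term appears anywhere as a substring."""
    return any(term in post.lower() for term in BLOCKED_TERMS)

def whole_word_flag(post: str) -> bool:
    """Flag only whole-word matches: fewer false positives, but easily
    evaded by spacing, misspelling or character substitution."""
    words = re.findall(r"[a-z]+", post.lower())
    return any(word in BLOCKED_TERMS for word in words)

posts = [
    "You absolute ass.",          # caught by both filters
    "The class assembles at 9.",  # false positive for the naive filter
    "You absolute a s s.",        # evades both filters entirely
]
for post in posts:
    print(f"{post!r}: naive={naive_flag(post)}, whole_word={whole_word_flag(post)}")

Each refinement closes one class of error while opening another, which is why expecting code alone to ‘stamp out’ racism understates the difficulty of the problem.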

Far better, we would suggest, to have an informed user base that understands the tools the platforms provide for reporting offensive content, confident that the platform will take such content down when reported.

Tech policy is littered with government demands that have fallen by the wayside. In admirable efforts to prevent young people from accessing pornography, default-on filters provided by ISPs are frequently switched off by users because, while blocking some sexual content, they also often block innocuous, or even valuable, educational content around gender, sexuality and relationships.

A shifting and complex relationship

The holy grail of age verification continues to be pursued, even after its failure in legislation (the now-revoked Part 3 of the Digital Economy Act). It turns out such checks are technically extremely complex in a country that has no formal ID card scheme, and the technical workarounds to bypass geographically specific controls are easy.

This is not to say these efforts were dishonourable or devoid of good intention. However, a lack of technical knowledge, and a failure to appreciate the complexities of the relationship between society, technology and law, result in many legislative failures. Digital technology, and technology companies, certainly have a role to play, but they are just one stakeholder in this complex relationship, and simply shouting ‘do more’ at them will not solve these social problems.

Platforms certainly have a role to play. Some of the more positive aspects of the draft Online Safety Bill require providers of user-to-user services to conduct risk assessments for the potential harms that might occur, and to provide tools to mitigate risk. However, it is telling that in the 145 pages of the draft legislation there are only two occurrences of the word ‘education’.

Platforms can conduct risk assessments and provide suites of tools for end users to make use of, but users need to understand the role they have to play. The responsible end user is an essential part of the data protection stakeholder model. And legislators need to appreciate what technology is, and is not, capable of, and to play their part by setting an example of responsible use.

Professionalism's place

BCS, the professional body for IT in the UK, offers a substantial base of technical knowledge through its roughly 60,000 members, who share a commitment to ‘Make IT good for society’.

Within its membership are specialists from all parts of the sector, from theorists to deeply technical practitioners. By working together, BCS can provide a great deal of input into the legislative process, advising on what is, and is not, possible with technology and on how tech regulation can be effective for all stakeholders. However, this does require an open and honest dialogue between those who legislate and those who will ultimately be tasked with implementing legislative demands.

Andy Phippen FBCS is Professor of Digital Rights at Bournemouth University and a Visiting Professor at the University of Suffolk.