BCS, The Chartered Institute for IT, was pleased to attend the evidence gathering session at the Houses of Parliament on Monday 24 February, hosted by the APPG on Artificial Intelligence. Victoria Temple reports.
With the theme Beyond GDPR, the session brought together a panel of experts from the commercial world, academia, the legal profession and beyond, for what proved to be a fascinating discussion. Chaired by Lord Clement-Jones CBE, the hearing looked at the implications of data ownership, privacy and user rights, raising fundamental questions about how data should be owned and how it can deliver public value without harming the individual.
Dr Florian Ostmann, Policy Theme Lead and Policy Fellow at The Alan Turing Institute, said that businesses and organisations needed clarity and reassurance: ‘The challenge is to ensure that consent for the use of data under GDPR is meaningful and informed,’ he said. ‘We must provide assurance to the consumer that data will only be used in a beneficial way.’
Harnessing the value of health data - ethically
Dr Kenan Direk, Research Data Manager, UCL Institute of Health Informatics, spoke about the ethics of using health data for epidemiological research, as well as planning health provision in the future. He described how using health data effectively has huge value for public health but, for that potential to be harnessed, there has to be transparency and meaningful engagement with the public about the value of that research. ‘That can only happen with the right regulatory framework in place,’ he said.
Tamara Quinn, Non-Contentious Intellectual Property and Data Privacy Partner at Osborne Clarke, spoke about the commercial and legal implications of data regulation. ‘Most companies which collect data are not AI or tech companies,’ she explained. ‘What really matters is creating clarity for business so that they can invest appropriately.’
The legal framework is struggling to keep up with the implications of using data, particularly in health; she told the APPG that the process of anonymising health data can itself be a breach of GDPR. ‘There are real problems here,’ she said, adding ‘we should not underestimate the extent to which GDPR compliance is informed by the market. If an organisation sees its competitors behave in a way which is not consistent with GDPR, it can be quite hard to persuade that company to do the right thing.’
What rights do employees have over data?
Quinn outlined the implications of the UK’s exit from the EU, saying that this potentially gives the UK freedom to create rules on data regulation, but stated that there is also ‘an inherent difficulty in trying to take the benefits of Brexit but make sure we are aligned for practical purposes with Europe.’
Andrew Pakes, research director for trade union Prospect, spoke about the use of personal data by employers in the workplace. ‘Increasingly, we see members on the receiving end of new technology in the workplace,’ he said, adding that there ‘was a lack of a framework about how employers handle employees’ data.
‘Unless we have some concept of group rights, we risk seeing data used to make decisions about people where there’s no balance of power. We want to embrace the future of work but want to get it right.
‘Workers are often not regarded as having a voice in change and are perceived as widgets rather than having autonomy over how their data is used.’
Assessing the value of data
James Kingston, Deputy Director at Hat Lab, discussed how we assess the value of data, explaining that it is a commodity which is essentially a ‘store of value’, a value derived only from its movement and the ability to build transactions on it.
Prof Ryan Abbott, Professor of Law and Health Sciences at the University of Surrey, asked the APPG to consider the ownership of data, using the example of self-driving cars, which could generate valuable data for insurance companies and others, though it remains unclear who would ‘own’ that data. ‘Data can contribute to knowledge, but data monopolies make it difficult for small firms to compete,’ he said.
The All-Party Parliamentary Group on Artificial Intelligence (APPG AI) was set up in January 2017 with the aim of exploring the impact and implications of Artificial Intelligence. Co-chaired by Stephen Metcalfe MP and Lord Clement-Jones CBE, Monday’s evidence-gathering session demonstrated how the APPG continues to provide valuable work in bringing together experts and seeking to address the big questions about AI. At BCS we look forward to the next event.
BCS, The Chartered Institute for IT, was commissioned by the Office for AI to conduct an independent review into how we create a substantial and diverse pipeline of ethical AI MSc graduates to bridge the skills gap.
The government has recognised the demand for at least 3,000 AI MSc graduates every year. The report, Scaling up the Ethical Artificial Intelligence MSc Pipeline, which sets out how to create a pipeline of ethical, competent MSc graduates skilled in machine learning and AI, was published in June 2019.