How to police personal data use

With companies and charities alike being investigated for pushing the legal boundaries of personal data use, BCS asks how much societal consent must exist for effective regulation to be possible.

TPP, a UK-based IT company with the admirable aim of using IT to join up the country’s fragmented healthcare systems, recently found itself under investigation by the data protection regulator, the Information Commissioner’s Office (ICO).

One of the company’s IT systems allows hospitals, care homes and community services to access GP records and leave their own notes. It emerged that the somewhat ominously named ‘enhanced data sharing’ function of this system had left GPs unable to tell their patients exactly who could see their data, allegedly breaching data protection legislation.

In a separate case, questions have been asked about data company Cambridge Analytica’s use of personal data to “supercharge” the Leave.EU campaign in the run-up to the EU referendum last year.

The company uses data analytics to create profiles of individuals and identify key swing voters, allowing them to be targeted with tailored campaign messages. The ICO has stated it “has concerns” about the company’s use of personal data in the UK, and has approached it as part of a wider assessment of data protection risks in the use of data analytics.

Last week the ICO dealt out hefty fines to 11 major UK charities for breaking the law in their use of donors’ personal data. The offences ranged from hiring companies to collect donors’ information without permission to sharing donors’ data with one another without the individuals’ consent. The list of offending charities was a veritable who’s who of household names, including the NSPCC, Macmillan Cancer Support and WWF.

The world of cybercrime has long been likened to a lawless ‘Wild West’, but these recent cases are far from deliberate illegality. More and more companies like TPP and Cambridge Analytica, along with third-sector organisations, are realising the value of personal data and the commercial possibilities it offers when interrogated with innovative new methods. Hungry to create new services and products, these organisations are increasingly pushing the boundaries of what they can do with data.

Good and effective policing is only ever achieved through consent. Regulatory bodies are no different: they rely on general adherence to the rules by those they seek to regulate, and need broad compliance with those rules to stand any chance of enforcing them. The problem for the ICO - which has more than 400,000 data controllers registered with it - is that policing and regulating this kind of environment is hugely difficult without a baseline of societal understanding and acceptance of where those boundaries lie.

So far, with huge swathes of the public struggling to understand (let alone form an informed opinion on) the use of personal data, there is little developed consensus about what the rules around it are, or should be. The public debate on the topic remains in relative infancy, with many only considering it when reports of data hacks or illegal data transfers hit the headlines. We are, in effect, creating our social understanding of the limits one headline at a time. This is not good for society, and it is a risky way of conducting business too.

While organisations are steadily getting to grips with their data protection obligations (with increasing urgency as the implementation date of the General Data Protection Regulation (GDPR) looms), the lack of societal understanding or consensus on what they should or shouldn’t be doing with the personal data they hold does little to discourage companies from innovating their way into the ICO’s spotlight.

This is not a quick or simple issue to fix, but broadening the public dialogue on personal data is crucial if a social consensus is ever to be reached.

The requirement to reach GDPR compliance by May 2018 looks likely to cause many companies severe headaches as they are forced to implement ‘consent by design’ in all aspects of their data handling. Hopefully, one positive consequence will be that the millions of customers having consent requests thrust in front of them will be forced to consider how much sharing of their personal data they actually find acceptable.

James Davies, BCS Policy Programmes Manager

Comments (1)

Tim Turner wrote on 12th Apr 2017:

There is no requirement anywhere in the GDPR for an organisation to implement ‘consent by design’. There is a separate concept of ‘Data Protection by design and by default’, but this is very different from the implication in this article that consent has to be wired into “all aspects” of an organisation’s data handling. It is shocking that an organisation with the BCS’ role in educating people about Data Protection is capable of promoting an error as basic as this.

