Frog in a pan?

June 2016

A recent BCS workshop saw a group of experts and thinkers meeting under the Chatham House Rule, chaired by David Evans, BCS Director of Policy and Community, to take a first run at one of the Institute’s four challenges: personal data. Brian Runciman MBCS reports.

For BCS this is about long-term goals to achieve societal change: ‘making IT good for society’. This makes it about people, with this meeting being a starting point for experts to feed into the personal data challenge, with the views of the wider membership, the wider IT community and society to come.

The thinkers and experts at this kick-off meeting discussed some general points, and then a ‘strawman’ document on personal data goals.

So, what should the world of personal data look like? For you, your friends, your relatives and so on...

Understanding among the public is needed. IT is the ultimate black box - permission requests in software are often just ticked without being understood, so a coherent policy for and from BCS needs to be widely comprehensible.

In organisations, data controllers collect data under very general terms to get maximum information. Terms and conditions are complex and make it difficult to get to the reason that data is being requested. Education is needed at a number of levels. For example, is it geolocation permissions that are being requested? And what does that mean?

It was noted that, anecdotally, for some financial web forms, even the speed at which a form is filled in is tracked. Pauses that are tracked become personal data, especially when actions are taken based on the information. It seems that on sections about criminal records or past medical conditions, there is a connection between changes in form-filling speed and dissembling. What type of data is that?

If time to respond versus accuracy is used as a measure of truthfulness, this could produce false assumptions without a user’s knowledge.

Semantics came up a number of times in the discussions. What is data? It is not just the information someone types on a form, but could include behaviour, how they walk for example, body language analysis and more. If that is codifiable as personal data, whose is it?

The US has facial recognition legislation that the UK doesn’t. One participant mentioned a thought experiment positing a sex robot that could replicate an actual person by scanning facial features, body shape and so on and imprinting them on the machine. It was noted that UK legislation doesn’t seem to prohibit this entirely. So this personal data is another flavour: a set of digital characteristics.

Road-testing an approach

A draft document was put together as a basis for debate, containing an introductory scene-setting piece and three sub points. Each was discussed by the participants.

Scene-setter

We are committed to making personal data work for everyone: organisations, individuals and society alike. We want to put people in control and at the same time empower organisations to use data in more beneficial ways, with genuine trust on both sides.

Our goal is to achieve the full potential of data by seeking the best possible public benefit. We want to get the best outcomes for the most people with the least risk and harm. We believe this is not only desirable, not only achievable but essential to the future of our society. To achieve that goal, we believe the following are essential ambitions: (to list)

‘We are committed to making personal data work for everyone...’ There is an assumption in that first line: what do people consider personal data to be? They need to have trust and confidence in how personal data is used.

Confidence is usually underpinned by ‘understanding,’ but is that conditional on other contexts? For example, would a ‘good enough’ understanding suffice? We could hardly expect everyone to understand algorithms, for example. ‘Transparency’ may be a better way to express this requirement.

Confidence increases with knowledge of safeguards. But in the wider context, it was noted that there is a discrepancy between EU and UK legislation on personal data use, which affects understanding and trust for both individuals and organisations. The Middle East doesn’t have formal personal data legislation at all. Hence the picture for the ‘global village’ is somewhat confused.

Cultural differences also need to be taken into account. We are no longer UK-only citizens.

One group came up with a simplified aim: ‘Greatest utility, greatest trust’. Trust needs a failsafe mode: not perfection, but effective mechanisms for dealing with failures.

It was felt that the last sentence of the introduction should include the concept of confidence directly, i.e.: ‘our ambition is that people should have confidence in the following areas:’

It was also noted that the introduction was lacking focus on the questions of ownership and control. Again the tendency for these debates to lean towards semantics came up - a case in point being that the idea of ‘control’ was too strict in this context.

Point 1: Safety

Individuals and organisations should be able to safely share and use personal data without fear or anxiety. We want to work together to minimise risks and impacts, and increase public confidence in use of personal data.

This is a confidence issue, but it was felt that ‘increase public confidence’ is not a dynamic enough goal. It could be increased from one per cent to two per cent, but we shouldn’t consider that a successful outcome. This section also seems to imply the need for a regulator, and the participants were sceptical of more regulation.

Safety is a component of trust, but again the idea of ‘transparency’ was raised: transparency with the ability to compare how organisations perform with personal data, and to compare choices between levels of control.

One group said that the first sentence needs to change to reflect that personal data belongs to the data subject. If people understand that, they are more likely to take action and to be involved in these discussions.

Point 2: Integration

Personal data that can be linked and integrated around people and organisations is powerful and useful. The full potential of personal data will only be realised when both individuals and organisations can use it to inform decisions, simplify and improve life and work.

The initial point was made that this seems more of a statement of hope than an aim. One group suggested the phrase: ‘Shared value, shared utility’ on the basis that we want to make better use of personal data, but for the greater benefit of mankind, society and the individual.

It was felt that ‘integration’ as a term is not strong enough - we need to include the concepts of interoperability and portability of data across services. This statement needs to include the role of government too. A phrase that acknowledges the dangers of linked personal data was suggested as an addition.

It was also noted that integration is only desirable when there is trust, accountability and positive relationships between controller and subject. We need to achieve core aspects of the other goals to truly benefit from integration.

Point 3: Relationships

Personal data terms of use should be a conversation not an ultimatum. We want personal data use to support a dynamic and balanced relationship between individuals and organisations. These relationships should support trust, foster innovation and lead to maximum mutual benefit.

This item was generally viewed as being a strong ambition and a worthwhile goal. This is about ethics based on community values. But with differing communities, sectors and interests, what is a ‘good’ data relationship? Personal responsibility should be highlighted.

The principle of ‘privacy by design’ that is woven into a relationship was mentioned. This should be something done with people rather than to people, including the acceptance of risk with the commensurate understanding of the implications of that risk.

An example given was that of a brothel. This sort of establishment is an example of an ethically dubious business that could be run in a safe way with a code of conduct. The consumer would have choice, but with knowledge of the risks. Extreme sports are another example where risks can be minimised to an extent, but participants have to, and need to, realise the dangers.

If someone opts out of sharing data, should they benefit from the positive effects that other people’s sharing produce?

Trust in the data is also a requirement. Organisations given approval to use it will only benefit from it, and potentially benefit society more widely with it, if they can trust its accuracy.

In an organisational context, with more people involved in decision making, conversations should be around not just what you can do, but what you should do.

What is the motivator for change? The point was made that just adding to regulation isn’t helpful. The example of financial services regulation was given, where it seems that very few of these are effectively enforced.

So is more regulation needed or, rather, better follow-up? In data, enforcement should take a transparent approach. At the moment the Information Commissioner’s Office doesn’t have the resources to police outcomes - in short, legislation rarely prevents what it forbids. Regulation was generally seen as failing.

The point was also made that the Data Protection Act would be good if it worked as intended in the real world. Data subjects haven’t really used their rights to check and change their data. The discussion also touched on education, and whether people can actually be made to care about these things.

Should there be a specific point at which education about data, its use and the implications of its use comes in? If so, is this at school? During further education? At one’s first job? Individuals need to be up to speed on these issues, and a specific point for addressing them was seen as beneficial.

The final point in this section was made on relationships between an individual and an organisation that wants to use their data: Is permission a one-off or a dynamic, ongoing conversation?

Some related thoughts

Our aim should be to create a market where desired behaviours prevail, but there is no point in creating laws against things we simply don’t like, because that makes the UK less attractive to do business in.

We need positive ways of effecting change. A further step down the line could take into account the perception that major companies seem to do what they like and get away with it. So could we go so far as to make a world where personal data becomes valueless through effective rights management?

The trust issue needs assessing. Knowledge empowers people to ask questions. People need understanding to ask reasonable questions of the market. Perhaps, within a defined area, you don’t need to care what happens to your data because it is serving its functional purpose, but when the purpose changes then you need to know.

So clarity and transparency are needed, but we don’t need to have this on our minds on a day-to-day basis. Knowledge, trust and accountability may not require a complete, deterministic understanding of how data is used; trusting that it will be used well could even lead to a pleasant surprise in how it is used.

Individuals need to feel ownership of their data in a legal sense and an emotional sense. The legal angle may be, of necessity, complex, but people should be able to feel in control. At the moment, are people so used to the system, to the status quo, (almost institutionalised) that they are comparable to the proverbial frog in a cold pan of water, who doesn’t react as the temperature rises because they are unaware of any other options?

Any approach for the public at large also probably needs to be relatively effort-free. It is too much to expect all members of society to react well to an attitude of ‘if only everyone LEARNED’.

Convenience tends to trump all in society now - behaviour that has only been exacerbated by app-driven culture. The example was given of the https protocol, which works largely unseen. If extra steps were required to log in to, for example, your bank’s website, it probably wouldn’t be used.

As a starting point… what could BCS do?

The idea of this workshop was not to go straight to possible solutions to these issues; however, inevitably, these conversations tend that way. Some ideas put forward included:

  • An education programme. Are people resigned to the use of personal data, or ignorant of it?
  • BCS could provide advice on the concepts of personal data use as against actual tools and products. Perhaps these could be ‘toolkits’ so the public understand the trade-offs between what they reveal and what they could get, as in current ‘freemium’ models, for example.
  • BCS could institute a scheme to assess organisations and their use of personal data by agreed criteria.
  • BCS could demonstrate viable alternatives, providing use cases, best practice for consent and so on.
  • BCS could provide advice to enterprises on how to maintain public trust. As a general approach, we need to present this as a battle individuals can win.
  • A ‘tick’ scheme, like the BSI kitemark, with a logo that works as a ‘good data use’ tick could be developed. This would include definitions and raising the issues of basic individual rights.
  • Perhaps a lighter version of the above approach would be better, analogous to the building trade’s Considerate Constructors Scheme.
  • If the UK government has outsourced data regulation to the EU, which some contended, then we also need a lobbying campaign with our own parliamentarians.
  • Perhaps there is an entrepreneurial opportunity for ‘Trusted Personal Data Controllers’.
 

Image: iStock/Sascha Burkard

Comments

Simon Grant wrote on 5th Jul 2016:

A great deal of work has been done in recent years around personal data in learning, education and training. The issues are far from clear-cut, needing to balance at least privacy and usefulness. My colleagues in Cetis and I have done much work in this area.

Simon