
Navigating the Temptations of Data

It seems easy to come up with new ways we might re-use data we already have, but much harder to work out, in advance, whether an idea is likely to be perceived as unethical, intrusive, or just creepy. In a recent paper – “Between the Devil and the Deep Blue Sea (of Data)” – I explored how simple questions might help us look at ideas from different perspectives and identify the ones most likely to be accepted.

Before starting to implement anything – while the idea is at back-of-an-envelope stage – we can discuss it with stakeholders, to test how they are likely to respond. This should include at least those whose data will go into the proposed process, those who will do the processing, and those who will use the results. Some questions to frame that discussion:

  • Will it help? Do we have a process/resources to act on any signals the data may contain?
  • Will it work? Now we know what the process is, do the data contain the signals that process needs?
  • Will it comfort? Or might it make people nervous, whether about the data we are collecting, how/why we are processing it, or who can see the data or results?
  • Will it fly? Are the assumptions we are making valid, and will they stay that way? This may be particularly tricky when the aim of the process is to modify the environment within which the data are collected. Will the process have different effects, whether through traditional forms of discrimination or for groups with different digital footprints arising from different learning styles, different equipment, or different experiences?

If we still have a “warm feeling” about the idea, Data Protection law can provide further sanity checks, even if we don’t think we are processing personal data. For now, we are mostly looking at points to discuss – though the information gathered will be a great help if we do come back later to look formally at legal requirements. But if it’s hard to explain the answers to these high-level questions (based on the Information Commissioner’s Twelve Steps to Prepare for GDPR), that’s probably a warning sign that the idea itself needs more work:

  • Data Protection by Design/Impact assessments. What are the benefits and risks for individuals (not, at least not directly, for the organization)? Are there ways we could reduce the risks?
  • Information Lifecycle. How will we collect the data and use it? Do we need to involve others? When and how will we destroy it?
  • Breach Notification. How will we know when something has gone wrong and limit the harm? How might our handling and reporting of breaches – to users, regulators or management – increase or decrease trust?
  • Legal Basis. This may sound dry, but having a clear legal basis will make your proposal much easier, and more reassuring, to explain. It’s also the gateway to lots of useful guidance. Is the processing needed:
    • to deliver an agreement you’ve made with the individual (e.g. processing bank details to pay salary),
    • to meet a legal requirement (informing the tax office about the salary),
    • to protect life (from an imminent threat),
    • to serve a public interest (something that’s within our lawful remit),
    • to serve a legitimate interest (whether ours, individuals’, or third parties’)? And are we sure that benefit doesn’t come at too high a cost to the rights and freedoms of individuals?
    • Or, if you can’t fit it into any of the above, can you meet the high bar for consent: giving students or staff a genuinely free and easily withdrawn choice?
  • Privacy Notice. The content of the notice flows fairly automatically from the questions above, but how will you communicate a new use of data that you already have?
  • Individual Rights Processes (information, subject access, objection, etc.). As with Privacy Notices, if these look hard to provide, then the proposal may need rethinking to reduce the risk of it being seen as unethical.

Discussing these questions early in the design stage doesn’t just test whether the idea is a good one. It can also reveal opportunities to make it better. For example, if there’s a spread of opinion as to whether it seems creepy, can we make it optional, or add controls for those (users and stakeholders, as well as data subjects) who are uncomfortable with it? If not, maybe that’s another warning sign. If we can, it’s much easier to do so early in the development process than late.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
