Data Protection 3.0: law and ethics

To my ex-programmer ears, phrases like “web 2.0” and “industry 4.0” always sound a bit odd. Sectors don’t have release dates, unlike Windows 10, iOS 12 or Android Oreo. Oddly, one field that does have major version releases is the law: it would be quite reasonable to view 25th May 2018 as the launch of Data Protection 3.0 in the UK. Looking at past release cycles, it seems likely to be fifteen to twenty years before we see version 4.0. During that time, the web, industry, education, technology and all other uses of personal data will develop in ways that those drafting the law could not have foreseen. The gap between what the law says and what technology makes possible is bound to increase.

You might hope that a brand-new law would at least answer current questions: “what can we (lawfully) do?”. But, with a ‘developer release’ fixed in 2016 and initial drafts dating back to 2012, even that may be optimistic. From now on it’s inevitable that we’ll be asking questions that the drafters could not have envisaged. Principles may remain valid – indeed the 2016 General Data Protection Regulation (GDPR) declares that the principles of its 1995 predecessor remain sound. But details will steadily diverge from reality.

To bridge that gap, we need to ask a different question: “what should we do?”. Not just because the law’s answers are unlikely to be clear (e.g. the long-standing legal distinction between controller and processor is very hard to apply to many cloud service models) but because such answers as it does give may no longer reflect individuals’ expectations (e.g. both regulators and web users seem increasingly uncomfortable with the law’s current rules on cookies). Even if the law does appear to give a clear answer to a question, we should probably double-check that with other sources.

We are already starting to see that, with universities and colleges asking “is this ethical?” rather than just “is it lawful?”. Reference points for that question include the Data Protection Principles, but also documents such as the Menlo Report: Ethical Principles Guiding Information and Communication Technology Research. Academic analysis of related areas such as smart cities can also provide useful sources to compare and contrast: if we are different, what risks and opportunities might that difference create? An ethical approach asks us to think hard about what we are doing, to document our draft conclusions and try to achieve consensus on an acceptable approach.

This may well be more time-consuming than simply looking up an answer in “the law”, but it should produce a more robust plan that can respond to legal, ethical and practical challenges. Some organisations seem to stagger from one legal, ethical or PR incident to the next. Doing ethics first, and being transparent about the process and outcome, should help us avoid that. Once we’ve decided what we should be doing, it’s easier to go back to the law, identify the relevant sections, and work out what it says about how we should be doing it.

Such a process is more likely to accord with another trend in both data protection law and public attitudes: away from simply dumping decisions on the individual (often labelled “consent”, though it rarely satisfies either the ethical or the legal definition) and towards organisations taking responsibility and demonstrating that they are doing so. The GDPR calls this “accountability”, though it covers more than just holding organisations to account for errors. Individual rights and controls remain important when things go wrong – though we would hope that only happens where an unknown fact or unforeseen event affects the ethical assessment – but we shouldn’t be relying on them to correct the faults we leave in our designs.

Data Protection 3.0 should, perhaps, be less about “pushing the boundaries” and more about developing responsible practice. That might be confirmed by regulators and, one day, incorporated into the next release. “Compliance”, even if we could define it, should be the very least we aspire to.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
