
Data Protection Reform?

Looking at the contents of the Government’s new Bill suggests it may be more about Digital Information than Data Protection:

  • Personal Data Processing (1-23)
  • National Security & Intelligence Services (24-26)
  • Information Commissioner’s Role etc. (27-43)
  • Miscellaneous (44-45)
  • Digital Verification Services (46-60)
  • Customer & Business Data (a general framework for services like Open Banking) (61-77)
  • Privacy and Electronic Communications (78-86)
  • Trust services (87-91)
  • Public service data sharing etc (92-99)
  • Information Commission Governance (100-106)

Even those first 23 clauses, which cover the day-to-day processing of personal data, are largely clarifications or re-phrasings of existing (UK) GDPR and Data Protection Act provisions, so they seem unlikely to require organisations to change their existing processes.

The Bill is presented as a series of amendments to existing laws, which makes it hard to interpret, but things I spotted include:

  • New (narrower) definition of personal data (c1). The original GDPR definition (Art.4(1)) turns on the phrase “identifiable natural person” but doesn’t answer the question “identifiable by whom?”. For a while there were two conflicting German cases, one saying “by the data controller”, the other “by anyone in the world”. The DPDIB goes for the former. As far as I know this was never resolved at EU level: perhaps because, when designing processes, it doesn’t really matter. Any collection of data is likely to include some people whom the data controller can identify – customers, staff, those who have self-identified, … – so you have to have a process for people “identifiable by me” anyway. And once you have such a process, it seems like a lot of work to develop a parallel one to work out which records are “not identifiable by me” and handle those differently.
  • Research and Statistical Purposes (c2,3,9&22). These sections largely consist of text copied or redrafted from the GDPR’s many Recitals (notably 156-162) and Articles (principally 89) on “scientific research”. It has been suggested that the new definition in c2 expands such research to cover both commercial and non-commercial activity, but I can’t see anything in the original GDPR (in several different language versions) that currently limits those provisions to “non-commercial” activities. Indeed, the current ICO guidance says “It can also include research carried out in commercial settings, and technological development, innovation and demonstration”.
  • Recognised legitimate interests (c5 & Annex 1). These seem to be a list of purposes that could have been covered by the existing “public interest” and “legitimate interest” justifications. Worryingly, although the category name still mentions “legitimate interest”, there doesn’t seem to be any requirement to consider the interests or fundamental rights and freedoms of individuals, as under the old Article 6(1)(f) Legitimate Interests. We have found that balancing test really useful to reassure both ourselves and our users that we are applying appropriate safeguards, for example when processing to ensure system, network and data security.
  • Automated decision-making (ADM) (c11). This seems to answer another long-standing (since 1995!) open question: whether a “right not to be subject” to ADM is a preemptive ban on making such decisions, or creates a right of human review of individual decisions. In the UK, under the DPDIB, the answer seems to be a ban on ADM using special-category data (unless “necessary for reasons of substantial public interest”, as in Art.9(2)(g)) but a right of review otherwise.
  • Data Protection Impact Assessments (c17&18). These have been relabelled as Assessments of High Risk Processing, and several sources of guidance have been removed, but the basic idea remains: when planning processing, the controller must think about potential risks to individuals’ rights and freedoms and take documented measures to ensure any high risks are mitigated. We at Jisc have found this a very useful tool in planning our own services and reassuring their users (see, for example, our published DPIAs on network security and learning analytics).

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
