GDPR: Data Protection Impact Assessments

The Article 29 Working Party of European data protection supervisors has published the final version of its Guidelines on Data Protection Impact Assessments (DPIAs). These build on the long-standing concept of Privacy Impact Assessments: like conventional risk assessments, but examining the risks to the individuals whose data are being processed, rather than to the organisation doing the processing. Having identified the risks, the DPIA process should then consider how they may be mitigated, and ensure that this mitigation reduces them to an acceptably low level.

Under Article 35 of the GDPR, performing a DPIA is mandatory for any processing activity that represents a high risk to individuals. The Guidelines provide a list of nine characteristics of processing:

- evaluation or scoring
- automated decision-making with significant effect
- systematic monitoring
- sensitive data
- large-scale processing
- combining datasets
- vulnerable data subjects (including employees)
- innovative technological or organisational solutions
- processing that prevents individuals exercising their rights

They suggest that any activity including two or more of these is likely to require a DPIA. A table of worked examples provides useful comparisons for organisations assessing their own activities. In addition, supervisory authorities are encouraged to make lists both of activities that do require a DPIA and of those that do not.

Once a DPIA has been decided on, the next question is which risks need to be assessed. Here the guidelines provide little help. Although "privacy" and "data protection" are distinct rights in European law, the guidelines appear to treat "Privacy Impact Assessment" and "Data Protection Impact Assessment" as synonymous. Annex 1, which suggests existing processes likely to be satisfactory, includes both types (including the Information Commissioner's PIA Code). It is therefore unclear whether a DPIA should look only at risks to non-public data, or also cover issues such as potential misuse of public directories (a data protection issue, but not a privacy one) or, as suggested on page 6 of the guidelines, risks to all rights and freedoms, including free speech and freedom from discrimination.

The guidelines are not, in themselves, detailed enough to be used to conduct a DPIA. Instead, organisations could look at the various Codes referenced in Annex 1, or use the list of DPIA features in Annex 2 to perform a gap analysis against their existing risk assessment and development processes, to determine how these could be developed into an acceptable DPIA.

Formally, the legal requirement to perform a DPIA applies only to new activities and to those where risks have changed. The draft guidelines contained a specific deadline by which existing high-risk processing should be subject to a DPIA; this has now been replaced by an expectation that this will happen as risks to personal data are periodically reviewed. The guidelines also note that performing a DPIA and publishing a summary can help to build confidence in an organisation and its processing, so there may be benefits in applying the approach more widely.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
