
DPIAs: First Attempts

Article 35 of the General Data Protection Regulation introduces a requirement to conduct a formal Data Protection Impact Assessment (DPIA) for any processing that may involve a high risk to individuals. The Article 29 Working Party’s DPIA guidance contains a helpful list of nine factors that may give rise to a high risk. Any activity involving two or more factors is likely to require a DPIA. We identified two of Jisc’s services as perhaps reaching those thresholds – the Janet Security Operations Centre (SOC) and the Learning Analytics Service – because both involve large-scale processing and innovative technologies. Although the GDPR formally only requires a DPIA for processing that begins after 25th May, we considered that a DPIA would be a good way to reassure both ourselves and Jisc customers that the protections designed into both services were indeed appropriate.
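The "two or more factors" screening rule above can be sketched as a simple check. This is an illustrative helper only, not part of any Jisc or regulator tooling, and the factor names are paraphrased from the Article 29 Working Party's nine criteria:

```python
# Illustrative sketch of the Article 29 Working Party screening rule:
# processing matching two or more of the nine high-risk factors is
# likely to require a DPIA. Factor wording is paraphrased.
HIGH_RISK_FACTORS = {
    "evaluation or scoring",
    "automated decisions with legal or similar effect",
    "systematic monitoring",
    "sensitive or highly personal data",
    "large-scale processing",
    "matching or combining datasets",
    "data concerning vulnerable subjects",
    "innovative use of technology",
    "prevents exercise of a right or use of a service",
}

def dpia_likely_required(applicable_factors: set) -> bool:
    """Return True when two or more high-risk factors apply."""
    unknown = applicable_factors - HIGH_RISK_FACTORS
    if unknown:
        raise ValueError("unrecognised factors: %s" % unknown)
    return len(applicable_factors) >= 2

# Both Jisc services involve large-scale processing and innovative
# technologies, so a DPIA is likely required:
print(dpia_likely_required({"large-scale processing",
                            "innovative use of technology"}))  # True
```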

To allow any changes to be incorporated into the current development cycles of the services, we needed to do the DPIAs this spring. That meant we couldn’t use the UK Regulator’s guidance, which was published in draft the day before our first information-gathering meeting. Instead we developed a process based on the GDPR itself, the Article 29 guidance and the DPIA toolkit from the French Regulator, CNIL. For each service Jisc’s data protection team spent half a day talking with service managers, guided by an informal data gathering crib-sheet that we derived from those external sources.

DPIA collection cribsheet v1

The first stage was to describe the processing operations at a relatively high level: the purpose of the processing, the legal basis, and the relationships and information flows between Jisc and any third parties involved. We looked at what processing takes place, under what circumstances, where and how. Then we looked at what data are processed and the types of data subjects who may be affected.

To justify the processing we need to look at whether it is necessary for the purpose (in the GDPR sense that the purpose cannot be achieved in any less intrusive way) and then consider whether the benefits justify the processing: either against the general requirement that processing be proportionate or, where we are using legitimate interests as the lawful basis, under the stricter test considering risks to the rights and freedoms of individuals. Although guides seem to suggest doing this later, we found it simplest also to document at this stage how we deliver the various data subject rights that apply to the particular legal basis chosen.

Although CNIL suggests looking at harms (arising from breaches of confidentiality, integrity and availability) and threats (internal, external and environmental) together, we found it easier to assess them separately. Much the same harms arise out of a breach whatever its cause, so we assessed the impact on the rights of individuals that could arise from each different type of breach occurring at different points in the service. This gave us an initial assessment of high, medium or low impact for each type of breach.

We then looked at mitigations that would hinder each type of threat source. For example, firewalls and access control mitigate breaches of all kinds by outsiders; role-based access, contracts and training mitigate breaches by insiders. Although a few mitigations (e.g. locating systems in secure, resilient data centres) can reduce the impact of some kinds of breach, in most cases mitigations work by reducing the likelihood of a breach occurring. The volume of data processed generally means that a breach, if it did occur, would still have a high impact.
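The reasoning above can be sketched as a tiny impact × likelihood calculation. The levels and the combination thresholds here are illustrative assumptions, not Jisc's or CNIL's actual scoring scheme; the point is only that residual risk falls when mitigations reduce likelihood, even while impact stays high:

```python
# Minimal sketch of the risk reasoning in the text: most mitigations
# lower the likelihood of a breach rather than its impact, so residual
# risk can fall to an acceptable level even though a breach would
# still have a high impact. Levels and thresholds are illustrative.
LEVELS = {"low": 1, "medium": 2, "high": 3}

def residual_risk(impact: str, likelihood: str) -> str:
    """Combine impact and likelihood into an overall risk level."""
    score = LEVELS[impact] * LEVELS[likelihood]
    if score >= 6:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# A large-scale breach keeps a high impact, but firewalls, access
# control, contracts and training reduce its likelihood:
print(residual_risk("high", "high"))  # before mitigation: "high"
print(residual_risk("high", "low"))   # after mitigation:  "low"
```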

Reviewing the risks and mitigations, we concluded that both services did indeed have sufficient protections to reduce their risks to a low, acceptable level. Some suggestions for improvement were made, and plans agreed to implement these and monitor ongoing compliance. Those documents are currently being finalised, but we hope to publish them soon.

One interesting distinction between the two services is that whereas for the SOC Jisc is the data controller, for the Learning Analytics Service we are a data processor on behalf of the organisations that upload their students' data to the service and then contract with providers to run analyses on those data. The GDPR makes the data controller responsible for the DPIA, but the Article 29 Working Party confirms that information from any data processors can assist them in doing so. This matches our experience: our two DPIAs have a different focus. When acting as a data controller, our DPIA can conclude that all the obligations of the GDPR will be met. As a processor, our main role is instead to provide the tools and information that data controller organisations can use to meet those obligations. For example, as a processor we can design appropriate access control technologies and tools to support the exercise of data subject rights; but for these to adequately protect individual rights, the data controller organisations must still manage the access credentials appropriately and provide students with ways to exercise their rights.

Finally, the GDPR recommends that consultation with stakeholders should form part of the DPIA. Both of our services had already been the subject of extensive stakeholder consultation on privacy and compliance issues. It also seemed likely that further consultation would be more productive if based on an existing risk assessment. We are therefore planning to use these DPIA documents, along with additional DPIA guidance that may have become available, as a starting point for consultation in the next DPIA cycle for each service.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
