Sandbox Tales – Information Sharing Platforms

The latest reports from the ICO sandbox provide important clarification of how data protection law applies to, and can guide, novel technologies. This post looks at information sharing…

FutureFlow’s Transaction Monitoring and Forensic Analysis Platform lets financial institutions such as banks upload pseudonymised transaction data to a common platform where they, regulators and other agencies can look for patterns across the combined data set to detect and investigate “unusual behaviours and transaction patterns” that may indicate financial crime. This looks a lot like the information sharing platforms used by computer security incident response teams (CSIRTs), so it’s good to see that the sandbox report largely supports the legal model (explained in detail by the MISP project) that those have been using.

First, although uploaded data are typically pseudonymised – in FutureFlow’s case through standardised hashing of identifying data, in CSIRTs’ case because most data are associated with pseudonyms such as IP addresses – they should not be treated as anonymous. Pseudonymisation reduces the identifiability of data, but these datasets are sufficiently rich that a “motivated intruder” who gained access might still be able to identify individuals. Data protection law therefore applies.
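To make that mechanism concrete, here is a minimal Python sketch of this kind of pseudonymisation (the key name and scheme are illustrative assumptions; the sandbox report does not specify FutureFlow’s exact hashing method). A keyed hash gives each identifier a stable pseudonym that can be matched across the combined data set without revealing the underlying value:

```python
import hmac
import hashlib

# Hypothetical secret shared among contributors. A plain, unkeyed hash of a
# low-entropy identifier (e.g. an account number) could be reversed by brute
# force, so a keyed hash (HMAC) is used instead.
PSEUDONYMISATION_KEY = b"replace-with-a-securely-shared-secret"

def pseudonymise(identifier: str) -> str:
    """Map an identifier to a stable pseudonym.

    The same input always yields the same token, so patterns can be traced
    across contributors, but the token alone does not reveal the identifier.
    Note: the result is still personal data, since anyone holding the key
    (or sufficiently rich contextual data) could re-identify individuals.
    """
    return hmac.new(PSEUDONYMISATION_KEY, identifier.encode(),
                    hashlib.sha256).hexdigest()

record = {"account": pseudonymise("GB29NWBK60161331926819"), "amount": 1250.00}
```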

The first question is therefore which parties are Data Controllers and which are Data Processors. Institutions that upload data to the platform (and presumably those that access and download it, if different) are Data Controllers, since they decide which data to upload and what purposes (within the platform’s technical and policy limits) to use them for. Where, as in FutureFlow’s case, the platform operator does neither, it is likely to be a Data Processor. This suggests that in federated sharing platforms, where contributors can run their own instance of the platform and link it to others, the contributor function would dominate and those organisations would be Data Controllers. But a platform operator – like FutureFlow – that merely develops algorithms and runs them on others’ data may be a Data Processor.

Interestingly, although GDPR Article 35 makes Data Protection Impact Assessments (DPIAs) a requirement only for Data Controllers, the sandbox report suggests that a DPIA can also be a good way for a Data Processor to document the risks it has considered and the security measures it has adopted to manage them (this is the approach Jisc has taken with the DPIA for its Learning Analytics platform).

Using a platform to share personal data requires a legal basis, chosen from GDPR Article 6. Although financial institutions may have a legal obligation to prevent fraud, the FutureFlow report suggests that “necessary for legal obligation” (Article 6(1)(c)) is probably not the best choice. This is because the sharing platform “demonstrates maximum effectiveness when applied to a broad account base, prior to any firm indication that any accounts have been involved in suspicious activity”. Or, as we tend to express it in incident response, to identify malicious anomalies you need to know what normal looks like. At this “pre-suspicion” stage, “necessary for a legitimate interest” (Article 6(1)(f)) is more appropriate, and brings the additional reassurance that each Data Controller must ensure not just that its use of data is legitimate (for computer and network security this is likely to invoke GDPR Recital 49), but also that it is not overridden by the impact on the rights and freedoms of the individuals involved. The platform’s DPIA should be useful in performing this balancing test.
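As a deliberately simplified sketch of that “know what normal looks like” reasoning (not FutureFlow’s actual analytics; the threshold, statistics and names here are all illustrative assumptions), anomaly flagging typically compares each transaction against a baseline learned from a broad sample of ordinary activity:

```python
from statistics import mean, stdev

def build_baseline(amounts: list[float]) -> tuple[float, float]:
    """Learn what 'normal' looks like from a broad sample of activity."""
    return mean(amounts), stdev(amounts)

def flag_unusual(amount: float, baseline: tuple[float, float],
                 threshold: float = 3.0) -> bool:
    """Flag transactions more than `threshold` standard deviations from normal."""
    mu, sigma = baseline
    return abs(amount - mu) > threshold * sigma

# Without a broad 'pre-suspicion' data set the baseline is unreliable, and
# genuine anomalies cannot be distinguished from ordinary variation.
normal_activity = [120.0, 95.5, 130.0, 110.25, 101.0, 89.9, 125.0, 115.0]
baseline = build_baseline(normal_activity)
print(flag_unusual(15000.0, baseline))  # True: far outside the normal range
print(flag_unusual(112.0, baseline))    # False: consistent with the baseline
```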

Finally, one difference between FutureFlow and CSIRTs’ sharing platforms is that FutureFlow appears to let institutions see only their own data, with automated flags added by the platform. Incident response – at least at present – typically relies on more manual investigation, with participants likely to have some access to data uploaded by others. To reassure contributors that this increased risk is mitigated, platforms and the communities that use them may need to supplement technical and operational security measures with policies and/or legal agreements that ensure uploaded data will be used only in the ways, and for the purposes, intended. Where the platform offers a DPIA, such measures should be included there.
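A rough sketch of how that FutureFlow-style access model might be enforced (the class and field names are hypothetical, and this reflects neither platform’s actual implementation): the platform’s analysis runs across the combined data set, but each contributor’s queries return only its own records, enriched with the platform’s flags.

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    contributor: str        # institution that uploaded the record
    payload: dict           # pseudonymised transaction data
    flags: list[str] = field(default_factory=list)  # added by the platform

class Platform:
    def __init__(self) -> None:
        self._records: list[Record] = []

    def upload(self, contributor: str, payload: dict) -> None:
        self._records.append(Record(contributor, payload))

    def analyse(self) -> None:
        # The platform's algorithms run across the *combined* data set
        # (an arbitrary illustrative rule stands in for real analytics)...
        for record in self._records:
            if record.payload.get("amount", 0) > 10_000:
                record.flags.append("unusual-amount")

    def query(self, contributor: str) -> list[Record]:
        # ...but each contributor sees only its own records, with flags.
        return [r for r in self._records if r.contributor == contributor]
```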

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
