Privacy Enhancing Technologies: ICO draft guidance

The latest draft part of the ICO's guidance on data protection technologies covers Privacy Enhancing Technologies (PETs). This is a useful return to a topic covered in a very early factsheet, informed both by technical developments and by a better understanding of how technologies can (and cannot) contribute to data protection.

Perhaps the most important message comes in the very first section. All of these technologies can help to reduce risk – to data subjects and data controllers alike – but very few will turn personal data into anonymous data. Data protection law still applies, both to the application of PETs and to their results. Used thoughtfully, PETs can contribute in particular to data minimisation, security and risk reduction, making existing processing safer and, sometimes, permitting processing that would otherwise involve too high a risk.

Conversely, PETs can increase risk if used inappropriately. In particular, most PETs rely on the organisational measures and processes that surround them. Weaknesses in these measures, or misplaced trust in them, can undermine the protection the technology provides – or even increase the privacy risk, if they widen the scope or extend the duration of access to personal data. This makes PETs hard to add retrospectively: they are best incorporated at the design stage, where tools such as Data Protection by Design and Data Protection Impact Assessments can provide the breadth of analysis required.

Unlike the earlier guidance, the discussion of specific PETs assumes that basic security and minimisation measures have already been applied. There is no discussion of encrypted storage and transmission, or of pseudonymisation, for example: these should now be routine considerations for all data controllers. The division of PETs into three classes provides a useful framework:

  • Deriving or generating data which reduces or removes identifiability (the latter being the only group whose product might fall outside the definition of personal data);
  • Hiding or shielding data;
  • Splitting or controlling access to certain parts of data.

Familiar technologies can contribute to each of these – statistics, encryption and key-coding, respectively – and should be used where possible.
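
To make the "splitting or controlling access" idea concrete, here is a minimal Python sketch of key-coding: identifiers are replaced by keyed hashes, and the secret key is held separately from the coded dataset. The key and record shown are purely illustrative assumptions, not examples from the guidance.

    import hmac
    import hashlib

    # Hypothetical key, stored apart from the key-coded dataset
    # (e.g. by a separate controller or in a separate system).
    SECRET_KEY = b"held-separately-from-the-data"

    def key_code(identifier: str) -> str:
        """Derive a stable pseudonym from an identifier using HMAC-SHA256."""
        return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

    record = {"user": "alice@example.org", "visits": 42}
    coded = {"user": key_code(record["user"]), "visits": record["visits"]}
    print(coded)  # the identifier is replaced by an opaque pseudonym

The pseudonyms are stable, so records about the same person can still be linked, but only the key holder can recreate or reverse the coding – which is also why the result remains personal data in the key holder's hands.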

The remainder of the guidance considers individual technologies in each of these categories. Most are still active topics of computer science research, so they are likely to be suitable only for exploration by technologically advanced data controllers. Oddly, the sequence in which they are presented – Homomorphic Encryption, Secure Multi-Party Computation, Private Set Intersection, Federated Learning, Trusted Execution Environments, Zero-Knowledge Proofs, Differential Privacy and Synthetic Data – doesn't seem to match either the document's own categories or my impression of how close to production use they are. Synthetic data and differential privacy are the ones I'd expect to be considering first.
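
Of the eight, differential privacy is perhaps the simplest to illustrate. The Python sketch below applies the standard Laplace mechanism to a counting query; the epsilon value and the data are hypothetical, chosen only to show the shape of the technique.

    import numpy as np

    rng = np.random.default_rng()

    def dp_count(values: list[bool], epsilon: float) -> float:
        """Release a noisy count of True values under epsilon-differential privacy."""
        true_count = sum(values)
        # A counting query has sensitivity 1: adding or removing one
        # person changes the count by at most 1, so Laplace noise with
        # scale sensitivity/epsilon suffices.
        sensitivity = 1.0
        noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
        return true_count + noise

    # Hypothetical example: how many of 1,000 visitors opted in?
    opted_in = [i % 3 == 0 for i in range(1000)]
    print(dp_count(opted_in, epsilon=0.5))  # true answer is 334, released with noise

The smaller the epsilon, the larger the noise, so choosing it is as much a policy decision about the privacy/accuracy trade-off as a technical one.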

The document is a draft for consultation: feedback to the ICO is welcome.


By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
