
Laws to help security and incident response

Last week I was invited to be a member of a panel at the UN Internet Governance Forum on how law can help security and incident response and, in particular, information sharing. It seems there are still concerns in some places that privacy law is getting in the way of these essential functions.

I started from how bad things would be if it were actually against the law to share information for security and incident response. Patches and anti-malware updates would be slower to arrive, since those often require sharing personal data. So our systems – PCs, phones, security cameras, baby alarms – would be vulnerable to attack for (much) longer than they are now. Reporting of attacks would be nearly impossible, since that almost always requires sharing details of attacking computers, accounts or websites. And, of course, it would be illegal to inform victims, so once their systems had fallen under the control of malicious outsiders, they’d stay that way. Eastern Berlin seemed a particularly apt place to be discussing such a nightmare scenario for privacy.

From that counter-example I derived four characteristics that Internet defenders need in a law on information sharing:

  • It has to be part of a privacy law, which recognises that insecurity is among the biggest threats to privacy, and contains internal checks and balances to protect both defenders and victims;
  • It has to be clear, so defenders can get on with defending and not spend scarce resources worrying about legality, or (apparently still a concern in some quarters) whether they need attackers’ consent to send out warnings;
  • It has to be limited to what is necessary, and require defenders to demonstrate that their activities have a net benefit for internet security and individual users’ privacy;
  • It has to have a broad scope – covering at least the operators of networks and connected systems, vendors, researchers and users – so that changes in legal regime don’t create unnecessary barriers to sharing between those groups.

It seems to me that Recital 49 of the General Data Protection Regulation gets pretty close to those requirements. We were also asked about regimes elsewhere, such as those based on the Council of Europe’s Convention 108. Interestingly, the European Court of Justice managed to infer something quite close to Recital 49 when examining the earlier Data Protection Directive in the case of Breyer v Germany, so even if your local law doesn’t contain an explicit Recital 49 equivalent, similar information sharing practices may still be OK.

I had also prepared some notes on how those drafting laws in this area might do even better than Recital 49:

  • Although Recital 49 talks about “processing necessary for security”, it doesn’t explicitly mention information sharing. You need to follow the trail through “legitimate interests” in Article 6(1)(f) to discover the safeguards under which such sharing is permitted;
  • The GDPR as a whole still creates some puzzles, even impossibilities, by combining a very wide definition of “personal data” (including IP and MAC addresses) with binary obligations on anyone processing “personal data”. No matter how much the law may demand that you proactively contact someone whose IP or MAC address you have, that may be technically impossible;
  • These, and other issues, still raise the temptation to argue that “an IP address is a machine not a person”, so that privacy law shouldn’t apply. Security and incident response teams shouldn’t need to rely on legal quibbles to make their actions lawful: they should be able to embrace privacy law wholeheartedly and show how their actions are essential to comply with it (here the encouragement from the Article 29 Working Party to “put in place processes to be able to detect and promptly contain a breach” is particularly helpful).

Done right, privacy and security should be very much on the same side. Both need to keep information secure from malicious actors to achieve their purpose. And both are damaged every time a computer falls under the control of such actors.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
