Law of the (AI) Horse?

When the Internet first came to legislators’ notice, there was a tendency to propose all-encompassing “laws of the internet” for this apparently new domain. A celebrated paper by Frank Easterbrook argued (my summary) that there wasn’t a separate body of new harms to address, and that existing laws might well prove flexible enough to deal with many of them. The title pointed out that studying (or creating) the “law of the horse” would ignore a lot of legal and social principles that are already well established. Looking at proposals for “AI laws”, I wonder whether we might be back in similar territory?

The proposed EU AI Act doesn’t seem self-confident. First it has to define “AI”; then it declares that most of that definition doesn’t need regulating anyway and, for the rest, proposes something that looks a lot like a traditional product safety law. The Act is already being criticized for an over-simplified view of supply chains. Perhaps starting with a scope that encompasses everything from speech recognition to probation recommendations was too ambitious? The lack of an AI law doesn’t seem to have hindered courts, which have applied everything from data protection to discrimination law to reach apparently satisfactory conclusions in cases of harm caused by AI. A very different approach is taken by the proposed EU AI Liability Directive: rather than creating new laws, it suggests how existing ones might be applied in complex AI supply chains.

So, for both legislators and developers of new technologies, the message seems to be to check how existing laws will apply. If that doesn’t seem right, try to work out an interpretation that fills the gap (or addresses any genuinely new harms) in line with the spirit and objectives of the existing rules. A recent review of “The Law of the Horse” considers this in more detail. For those developing or applying “AI”: make sure you understand how existing laws on personal data, discrimination and safety will apply to your idea. You may well find more guidance there than you expect.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
