Hints at ICO approach to AI

It’s interesting to see the (UK) ICO’s response to the (EU) consultation on an AI Act. The EU proposal won’t directly affect us, post-Brexit, but it seems reasonable to assume that where the ICO “supports the proposal”, we’ll see pretty similar policies here. Three of those seem directly relevant to education:

  • That remote biometric identification for non-law-enforcement purposes is high risk (para 26). This is unsurprising, given the ICO’s past comments that live face recognition (LFR) represents a step-change in invasiveness. But the EU proposal isn’t limited to LFR in public spaces: any processing of live (or near-live) video to try to identify or classify faces is likely to be covered.
  • That managing high-risk deployments needs a “continuous iterative process” (para 19). The EU proposal states (again, not binding on the UK, but perhaps a strong hint) that high-stakes assessments in education (e.g. course entry/selection) are high-risk uses of AI.
  • And (para 21) that there should be a public registry for high-risk AI systems “in particular for systems deployed in the public sector”. I don’t know whether education counts as “public sector” for this purpose – neither of the commonly-cited registers, in Amsterdam and Helsinki, seems to include educational deployments at the moment – but Helsinki does include a couple of library management applications and three chatbots.

Incidentally, Helsinki’s register may actually go further than the EU proposal. Two of the chatbots are deployed in health contexts, so do have an obvious reason for high-risk categorisation. But I can’t see why the other three applications (parking information, book recommendations and shelf management) would meet that threshold. The draft EU Act actually suggests that chatbots would normally qualify as low-risk (though not no-risk): any register that lists every use of natural language processing is going to need some good navigation tools to find the few high-risk applications amongst all the mostly-harmless ones.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
