Explaining AI algorithms

One of the concerns commonly raised about Artificial Intelligence is that it may not be clear how a system reached its conclusion from the input data. The same could well be said of human decision makers: AI at least lets us choose an approach based on the kind of explainability we want. Discussions at last week’s Ethical AI in HE meeting revealed several different options:

  • When we are making decisions such as awarding bursaries to students, regulators may well want to know in advance that those decisions will always be made fairly, based on the data available to the system. This kind of ex ante explainability seems likely to be the most demanding, probably restricting the choice of algorithm to those using known (and meaningful to humans) parameters to convert inputs to outputs (see the first sketch after this list);
  • Conversely, for decisions such as which course to recommend to a student, the focus is likely to be explaining to the individual affected which characteristics led to that decision being reached. Here it may be possible to use more complex models, so long as it’s possible to perform some sort of retrospective sensitivity analysis (for example using the LIME approach; see the second sketch after this list) to discover which characteristics of the particular individual had most weight in the recommendation that was provided for them;
  • A variant of the previous type occurs where a student’s future performance has been predicted and they, and their teachers, want to know how to improve it. This is likely to require a combination of information from the algorithm with human knowledge about the individual and their progress;
  • Finally, there are algorithms – for example deciding which applicants are shown social media adverts – where the only test of the algorithm is whether it delivers the planned results and we don’t care how it achieves that.
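
To make the first option concrete, here is a minimal sketch of an ex ante inspectable model, assuming the scikit-learn package. The feature names, data, and decision rule are hypothetical illustrations, not a real bursary process: the point is simply that every parameter mapping inputs to outputs can be read, and audited against a fairness requirement, before any decision is made.

```python
# A minimal sketch of an ex ante inspectable model, assuming scikit-learn.
# The features, data, and decision rule are hypothetical illustrations.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Hypothetical applicant characteristics.
feature_names = ["household_income", "distance_from_campus", "entry_grades"]
X = rng.random((200, 3))
y = (X[:, 0] < 0.4).astype(int)  # toy label: bursary awarded or not

model = LogisticRegression().fit(X, y)

# The whole decision rule is these coefficients plus an intercept, so a
# regulator can check in advance how each characteristic is weighted.
for name, coef in zip(feature_names, model.coef_[0]):
    print(f"{name}: {coef:+.3f}")
print(f"intercept: {model.intercept_[0]:+.3f}")
```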
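And for the second option, a minimal sketch of retrospective sensitivity analysis using the LIME approach, assuming the `lime` and scikit-learn packages; the model and student characteristics are again hypothetical. LIME fits a simple local model around one individual’s prediction to estimate which of their characteristics carried most weight.

```python
# A minimal sketch of retrospective sensitivity analysis with LIME,
# assuming the `lime` and scikit-learn packages. The model and the
# student characteristics are hypothetical illustrations.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from lime.lime_tabular import LimeTabularExplainer

rng = np.random.default_rng(0)

# Hypothetical student data: four illustrative characteristics.
feature_names = ["prior_grades", "attendance", "assignments_done", "forum_posts"]
X = rng.random((500, 4))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)  # toy label: recommend the course or not

# A more complex, opaque model of the kind LIME is meant to interrogate.
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

explainer = LimeTabularExplainer(
    X,
    feature_names=feature_names,
    class_names=["not recommended", "recommended"],
    mode="classification",
)

# Explain one individual's recommendation: which of this particular
# student's characteristics had most weight?
explanation = explainer.explain_instance(X[0], model.predict_proba, num_features=4)
for feature, weight in explanation.as_list():
    print(f"{feature}: {weight:+.3f}")
```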

Explainability won’t be the only factor in our choice of algorithms: speed and accuracy are other obvious factors. But it may well carry some weight in deciding the most appropriate techniques to use in particular applications.

Finally, it’s interesting to compare these requirements of the educational context with the “right to explanation” contained in the General Data Protection Regulation and discussed on page 14 of the Article 29 Working Party’s draft Guidance. It seems that education’s requirements for explainability may be significantly wider and more complex.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
