How to Start Learning Analytics?

One of my guidelines for when consent may be an appropriate basis for processing personal data is whether the individual is able to lie or walk away. If they can, then that practical possibility may indicate a legal possibility too.

When we’re using learning analytics, as a production service, to identify when students could benefit from some sort of personalisation of their learning experience, consent isn’t the right basis. Those opportunities should be offered to all students who might benefit from them, with the option to refuse when they know exactly what alteration or intervention is being proposed. Hence Jisc’s recommended model uses consent only at the point of intervention (and, by the same “can lie” test, when we are inviting students to provide self-declared input data into our models).

Legally, and morally too, if we are imposing processing on individuals then we need to ensure that it doesn’t create unjustified risks for them. Doing that shouldn’t be a problem when we know what objective we are aiming at and what information is likely to be relevant to that objective. However, this creates a chicken-and-egg problem: how do we find out what objectives are possible and what data might help with them?

For this sort of exploratory investigation, consent may be a more appropriate option. At this preliminary stage inclusiveness may be less important (though we need to beware of self-selecting inputs producing biased models) and we may indeed be able to offer the option to walk away at any time. Participants who do so must not suffer any detriment. One way to ensure this, and to satisfy the requirement that individuals must know the detailed consequences of participation, is to state that the outputs from pilot systems will not be used for any decisions, or to offer any interventions: no consequences, so no detriment. Learning which types of data can inform which types of outputs should be sufficient for the pilot stage; we can then use that knowledge to assess and implement our production algorithms and processes.

These thoughts were explored in my talk at the Jisc Learning Analytics Network meeting in November.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
