How to Start Learning Analytics?

One of my guidelines for when consent may be an appropriate basis for processing personal data is whether the individual is able to lie or walk away. If they can, then that practical possibility may indicate a legal possibility too.

When we’re using learning analytics as a production service to identify when students could benefit from some sort of personalisation of their learning experience, that ability to walk away is not what we want. Those opportunities should be offered to all students who might benefit from them, with the option to refuse once they know exactly what alteration or intervention is being proposed. Hence Jisc’s recommended model uses consent only at the point of intervention (and, by the same “can lie” test, when we are inviting students to provide self-declared input data into our models).

Legally, and morally too, if we are imposing processing on individuals then we need to ensure that it doesn’t create unjustified risks for them. That shouldn’t be a problem when we know what objective we are aiming at and what information is likely to be relevant to that objective. However, this creates a chicken-and-egg problem: how do we find out what objectives are possible and what data might help with them?

For this sort of exploratory investigation, consent may be a more appropriate option. At this preliminary stage inclusiveness may be less important (though we need to beware of self-selecting inputs producing biased models), and we may indeed be able to offer participants the option to walk away at any time. Participants who do so must not suffer any detriment: one way to ensure this, and to satisfy the requirement that individuals must know the detailed consequences of participation, is to state that the outputs from pilot systems will not be used to make any decisions or to offer any interventions. So: no consequences, no detriment. Learning which types of data can inform which types of outputs should be sufficient for the pilot stage: we can then use that knowledge to assess and implement our production algorithms and processes.

These thoughts were explored in my talk at the Jisc Learning Analytics Network meeting in November.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
