Onward from Learning Analytics

This morning’s “multiplier event” from the Onward from Learning Analytics (OfLA) project highlighted the importance of human and institutional aspects in a productive LA deployment. They begin at the end – what is the desired outcome of your LA deployment? The answer probably isn’t “a business intelligence report”, and almost certainly not “a dashboard”. Starting from “a one-to-one conversation between a student and their personal tutor” gives a much richer perspective.

That approach makes clear the importance of tutor and student preparedness: are tutors confident of having those conversations, do they know when and how to hand over to others, and what kind of conversations do students find most helpful? The project has developed online tools for the first two of those: to help tutors explore the most appropriate timing, messenger, medium, content and follow-up for their interventions, and to help them make “warm referrals”, where recommending that the student talk to other experts is seen as supportive rather than a brush-off.

Striking the right note for an individual student is hard, since it will depend on many factors, some of which are (at least initially) unknown to the tutor. For this reason, however tempting it may be, it’s probably not a good idea to try to shock students into working harder: there are just too many possible reasons why they may not appear to be making the expected progress. Sharing data with the student also needs to be handled with care: among other negative responses, too many graphs may be seen as obfuscating the message, and comparison with benchmarks may be demotivating. One productive approach is to present “the system” as an antagonist, and invite the student to collaborate with the tutor in changing behaviour so as to confound its expectations. More generally: data can be part of conversations, but it must not be at their core.

This greater understanding of what comes after the application of learning analytics technology should then inform what comes before. LA purpose(s) that align with the institution’s mission are much more likely to be supported. The answers to “Why do we need an LA platform? Who are we trying to help?” lead naturally to answers to “who needs to access it?”, “what data presentations will be helpful?” and “what data literacy will users need?”. These, in turn, help derive requirements for both systems and data.

In this light, identifying a small set of effective data sources becomes an operational requirement as much as a legal one: too many sources make it hard to explain how students’ difficulties can be addressed. Human understanding is essential: one algorithm identified “enrolment status” as a (statistically) strong indicator of outcome. Well, yes! An interesting idea was to use transparency as an operational requirement/test: if students can’t explain the system to each other then it’s too complex. Reducing the number of sources also reduces the work needed to maintain data quality and ensure that changes in collection or systems don’t disrupt the student support purpose. The project has an excellent infographic on these policy issues, which should be available on their website soon.

Finally, any application of learning analytics involves many trade-offs. Early interventions will be less accurate than waiting for more data, but intervening the day before a student obtains a poor grade isn’t helpful. Ease of use tends to increase automation, which reduces both student and staff autonomy. And there is no guaranteed right way to communicate bad news. But a multi-layered approach that covers everything from data to process, presentation and literacy, seems to provide the best opportunity to adapt to circumstances and bring each alert to a satisfactory conclusion.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
