This morning’s “multiplier event” from the Onward from Learning Analytics (OfLA) project highlighted the importance of human and institutional aspects in a productive LA deployment. The project begins at the end – what is the desired outcome of your LA deployment? The answer probably isn’t “a business intelligence report”, and almost certainly isn’t “a dashboard”. Starting from “a one-to-one conversation between a student and their personal tutor” gives a much richer perspective.
That approach makes clear the importance of tutor and student preparedness: are tutors confident about having those conversations, do they know when and how to hand over to others, and what kinds of conversations do students find most helpful? The project has developed online tools for the first two of those: to help tutors explore the most appropriate timing, messenger, medium, content and follow-up for their interventions, and to help them make “warm referrals”, where recommending that the student talk to other experts is seen as supportive rather than a brush-off.
Striking the right note for an individual student is hard, since it will depend on many factors, some of which are (at least initially) unknown to the tutor. For this reason, however tempting it may be, it’s probably not a good idea to try to shock students into working harder: there are just too many possible reasons why they may not appear to be making the expected progress. Sharing data with the student also needs to be handled with care: among other negative responses, too many graphs may be seen as obfuscating the message, and comparison with benchmarks may be demotivating. One productive approach is to present “the system” as an antagonist, and invite the student to collaborate with the tutor in changing behaviour so as to confound its expectations. More generally: data can be part of conversations, but it must not be at their core.
This greater understanding of what comes after the application of learning analytics technology should then inform what comes before. LA purpose(s) that align with the institution’s mission are much more likely to be supported. The answers to “Why do we need an LA platform? Who are we trying to help?” lead naturally to answers to “Who needs to access it?”, “What data presentations will be helpful?” and “What data literacy will users need?”. These, in turn, help derive requirements for both systems and data.
In this light, identifying a small set of effective data sources becomes an operational requirement as much as a legal one: too many sources make it hard to explain how students’ difficulties can be addressed. Human understanding is essential: one algorithm identified “enrolment status” as a (statistically) strong indicator of outcome. Well, yes! An interesting idea was to use transparency as an operational requirement/test: if students can’t explain the system to each other, then it’s too complex. Reducing the number of sources also reduces the work needed to maintain data quality and to ensure that changes in collection or systems don’t disrupt the student support purpose. The project has an excellent infographic on these policy issues, which should be available on their website soon.
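To make the “enrolment status” anecdote concrete, here is a minimal sketch (my illustration, with entirely synthetic data and hypothetical field names – it is not the project’s system, data or algorithm). A naive single-feature screen will rank an enrolment flag as the “best” predictor of completion precisely because it restates the outcome, which is exactly the kind of circularity that only human understanding of the data sources will catch.

```python
# Illustrative sketch only: synthetic students, hypothetical field names.
import random

random.seed(1)

def make_student():
    """Generate one synthetic student record."""
    engaged = random.random() < 0.7                 # latent "engagement"
    completed = engaged and random.random() < 0.9
    return {
        "vle_logins_per_week": random.gauss(8, 2) if engaged else random.gauss(3, 2),
        "assignments_submitted": random.randint(4, 6) if engaged else random.randint(0, 5),
        # Circular field: a student who has withdrawn cannot, by definition, complete.
        "enrolment_status_active": 1 if completed or random.random() < 0.2 else 0,
        "completed": 1 if completed else 0,
    }

students = [make_student() for _ in range(2000)]

def single_feature_accuracy(field):
    """Best accuracy achievable predicting completion from one field alone,
    using a simple 'value > threshold means completes' rule."""
    thresholds = sorted({s[field] for s in students})
    best = 0.0
    for t in thresholds:
        correct = sum((s[field] > t) == bool(s["completed"]) for s in students)
        best = max(best, correct / len(students))
    return best

for field in ("vle_logins_per_week", "assignments_submitted", "enrolment_status_active"):
    print(f"{field:25s} {single_feature_accuracy(field):.2f}")

# Typical output ranks enrolment_status_active highest -- statistically true
# ("well, yes!") but useless for deciding how to support a student.
```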
Finally, any application of learning analytics involves many trade-offs. Early interventions will be less accurate than waiting for more data, but intervening the day before a student obtains a poor grade isn’t helpful. Making systems easier to use tends to mean more automation, which reduces both student and staff autonomy. And there is no guaranteed right way to communicate bad news. But a multi-layered approach that covers everything from data to process, presentation and literacy seems to provide the best opportunity to adapt to circumstances and bring each alert to a satisfactory conclusion.