Opportunities and Choices: Digital Student Records and Privacy

I was recently invited by the Groningen Declaration Network to join a panel discussing privacy issues around the exchange of digital student records. Like the discussion, this summary is a collaborative effort by the panel team.

Two main use cases were discussed during the meeting: transferring records between education institutions when students apply to or take courses at other institutions, and providing statements of student achievement to support their applications for jobs or other appointments. There is an increasing need for these to work internationally: the Organisation for Economic Co-operation and Development expects eight million students to be studying outside their home country by 2025, and some countries already have more than 20% of their student body coming from overseas. The Dutch education ministry estimates that each overseas application costs €450 to process.

Opportunities

The motivation for the Groningen Declaration Network was recognition that existing paper-based processes are unsatisfactory: inefficient and with significant risks. Transferred paper records can go astray in the post or be entered incorrectly (even associated with the wrong student) when they arrive; the more people and processes involved, the greater the opportunity for errors. Transcripts may be misinterpreted or may not contain the information the recipient needs. Degree certificates can be copied or forged, damaging the reputations of both the issuing organisations and those students who obtained their qualifications legitimately. If the information were transferred in digital form then existing technologies might offer ways to adjust processes and ‘documents’ to significantly reduce these problems. For example:

  • A digitally-signed degree certificate could be effectively unforgeable: non-graduates couldn’t claim to have a degree and graduates couldn’t improve their own results (a sketch of such a signature scheme follows this list). This should also avoid endless processes where an employer asks for a letter confirming the validity of a certificate, then requests confirmation that the validity letter was itself genuine… (apparently this does happen!);
  • Digitally-signed certificates can be relied upon even if they are not obtained directly from the authoritative source; they can be verified without contacting the issuing institution every time. This could allow new patterns of information flow, for example a graduate might not have to reveal to their university just how many job applications they have made;
  • Digitally-expressed qualifications could provide different granularities of information. Some processes may only need to know that a qualification was awarded, others may need more detail or supporting information. Stanford University is developing PDF transcripts that include explanations of course content and links to the student’s work in an institutional repository. Transferring students between educational institutions may need even more detailed information, for example to allow a ‘home’ institution to award appropriate credit for study elsewhere;
  • Digital transfers of information should be less likely than paper to go astray; organisations and individuals could use cryptographic techniques to ensure they are delivered to, and only readable by, the intended recipient.
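To make the first two points concrete, here is a minimal sketch using the third-party Python ‘cryptography’ package. The Ed25519 keypair, the certificate fields and the workflow are illustrative assumptions on my part, not an agreed standard:

```python
# Minimal sketch: a university signs a certificate once; anyone holding the
# university's published public key can verify it offline, without contacting
# the issuer. Requires the third-party 'cryptography' package.
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# The issuing institution generates a keypair; the public key is published.
issuer_key = ed25519.Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()

# The certificate content (field names are illustrative, not a standard).
certificate = json.dumps({
    "student": "A. Graduate",
    "award": "BSc Computer Science",
    "classification": "First Class Honours",
    "issued": "2015-07-01",
}, sort_keys=True).encode("utf-8")

# Issuance: the university signs the canonical bytes of the certificate.
signature = issuer_key.sign(certificate)

# Verification: an employer checks the signature against the published key.
# Any change to the certificate (e.g. improving the classification), or a
# certificate signed by anyone else, raises InvalidSignature.
try:
    issuer_public_key.verify(signature, certificate)
    print("Certificate is genuine")
except InvalidSignature:
    print("Certificate was altered or not issued by this university")
```

Note that verification here needs no network connection at all, which is what allows the new information flows mentioned above: the employer never has to tell the university that a check took place.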

As well as these benefits for privacy and efficiency, the effect of digital systems on existing threats to privacy needs to be considered. For example:

  • Digital systems for storing or issuing information need to be protected against attack by technical means, particularly if they can be accessed and attacked over the Internet. A large collection of digital information may increase the impact of a successful attack by exposing more records; however, security measures are often more efficient at large scale, so those running large systems should also be better able to implement appropriate protection. Conversely, if digital qualifications are held by individuals it may be challenging to identify a format that is both safe on consumer devices and does not suffer from compatibility issues;
  • Existing paper systems are already subject to “blagging” attacks, where people with no right to see information attempt to persuade human operators to release it. Where organisations exchange digital information between their computer systems, it should be possible to establish strong technical measures (one is sketched after this list) to ensure that this only occurs with proper authorisation, though such approaches need to cope with both large and small flows of information: the Erasmus programme includes around 4000 institutions, but some exchange only 25 students a year. Verification of certificates by third parties is more challenging, since it may not be possible for the institution to verify the requester’s identity or authority to receive the information. In these cases individual graduates may be better placed to control who their information is disclosed to;
  • Processes involving collections of “big data” can raise concerns that the holder of the information has different incentives from those of the individual data subjects. In the cases of student application/transfer and qualification verification, the institution and the student appear to have a strong shared interest in making the process quick, accurate and efficient. Both processes will normally be triggered by some action by the student (e.g. applying for a place or job), so there is an opportunity to inform them of the resulting data transfers before they occur. A clear explanation of the obvious direct benefits for all parties should minimise concerns. If student/graduate information is used for other purposes, these also need to be clearly explained and justified to avoid the risk that they may damage trust in the core functions of the digital system.
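One such technical measure is to encrypt each record to the intended recipient’s published key, so that a transfer that goes astray is unreadable to anyone else. Here is a minimal sketch assuming the PyNaCl library; the keys and record content are illustrative:

```python
# Minimal sketch, assuming the PyNaCl library: a record is encrypted to the
# receiving institution's published public key, so a misdirected transfer is
# unreadable to anyone except the intended recipient.
from nacl.public import PrivateKey, SealedBox

# The receiving institution generates a keypair and publishes the public key.
recipient_key = PrivateKey.generate()
recipient_public_key = recipient_key.public_key

# The sending institution encrypts the record to that published key.
record = b"Transcript for student 12345: ..."
sealed = SealedBox(recipient_public_key).encrypt(record)

# Only the holder of the matching private key can decrypt the transfer;
# decryption by anyone else raises nacl.exceptions.CryptoError.
plaintext = SealedBox(recipient_key).decrypt(sealed)
assert plaintext == record
```

In practice the hard part is not the cryptography but key management: each sending institution must be able to discover, and trust, the receiving institution’s public key.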

Choices

Digital systems have the potential to support many different architectures, including central depositories, portals linking institutional systems, clearinghouses, or information held by individuals. Different applications may well suit different architectures. Each architecture will also have its own implications for privacy – both intended and unintended – because of what the choice implies for what information is stored, exchanged and disclosed, and by whom. Complex architectures may run the risk of “ethical dilution”, as those further from the source may be less aware of the constraints on how the data should be used. The implications of data, data flows and information about them (often referred to as “metadata”, and carrying its own significant privacy issues) should be carefully considered early in the design process.

For example, consider a credentials checking company that verifies the validity of student credentials on behalf of potential employers. If such a company represents many employers, it may learn a lot about the number and type of job applications made by an individual. Government organisations may be interested in this type of data from a completely different perspective: social security. If, for instance, those that rely on social security must apply for a certain number of jobs to qualify for benefits, and/or must accept jobs below their level of training or education after a certain amount of time, then accessing this kind of metadata may be tempting. Choosing instead to package digital student credentials as ‘tokens’ that can be validated without third-party intervention might reduce the creation of this kind of metadata, but would involve more work for employers and more inconvenience for users if they lose their token.
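To illustrate what such a ‘token’ might look like, here is a minimal sketch in Python (again using the third-party ‘cryptography’ package). The format, field names and helper functions are invented for illustration; they are not a proposal or a standard:

```python
# Minimal sketch of a self-contained credential 'token' (format invented for
# illustration): the signed payload travels with its signature, so an employer
# can validate it locally, leaving no metadata trail at a checking service.
import base64
import json

from cryptography.hazmat.primitives.asymmetric import ed25519

def make_token(payload: dict, issuer_key: ed25519.Ed25519PrivateKey) -> str:
    """Package the claims and their signature into one portable string."""
    body = json.dumps(payload, sort_keys=True).encode("utf-8")
    sig = issuer_key.sign(body)
    return (base64.urlsafe_b64encode(body).decode() + "." +
            base64.urlsafe_b64encode(sig).decode())

def check_token(token: str, issuer_public: ed25519.Ed25519PublicKey) -> dict:
    """Verify offline against the issuer's published key; no network calls."""
    body_b64, sig_b64 = token.split(".")
    body = base64.urlsafe_b64decode(body_b64)
    # Raises InvalidSignature if the claims were altered or forged.
    issuer_public.verify(base64.urlsafe_b64decode(sig_b64), body)
    return json.loads(body)

issuer = ed25519.Ed25519PrivateKey.generate()
token = make_token({"student": "A. Graduate", "award": "MSc"}, issuer)
print(check_token(token, issuer.public_key()))  # valid: returns the claims
```

Because the employer checks the token locally, no third party sees the verification happen, which is exactly why this design creates less metadata than a clearinghouse model.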

It can be hard to foresee what other uses a data exchange – or the metadata deriving from it – may have, especially if you are concentrating on functional goals. During the discussion we concluded that there is a parallel here with the different perspectives of software and protocol designers and of ethical hackers. Designers take a constructive view of systems, whereas ethical hackers look at systems with an eye for how their features can be exploited to gain elevated privileges or to access data that should normally be out of reach. Perhaps, when an exchange of personally identifiable information is being considered, an ethical hacker should also examine the proposal to identify what other uses the exchanged data (or the metadata generated by the exchange) might have.

Conclusions

Our discussions, both in the panel session and in informal conversations afterwards, suggested that technological tools already exist that could improve the effectiveness, efficiency and privacy of exchanges of student data. The challenge is to identify which processes can obtain most benefit from the many technical possibilities. Although we should aim for interoperability in the long term, it may still be too soon to commit to formal standards. New developments should, perhaps, be approached in the “skunkworks” style, with organisations being prepared to scrap or replace developments that turn out to be unsuccessful. Members of the Groningen Declaration Network are already conducting various pilot studies to identify promising areas and are committed to sharing the results at future meetings. It was suggested in particular that including stakeholders (including students) in these pilots might help identify approaches that are more likely to succeed.

The Network is also developing a set of privacy/ethical principles to inform its work, to ensure that students/graduates remain aware of, and in control of, what is done with their information, and that the information is treated appropriately by all those who have custody of it. Systems should never contain unpleasant surprises, but this should not depend solely on “notice and choice”. Ethical considerations may indicate that some options should be excluded as a matter of principle, even if some users might be persuaded to agree to them. And, as the security field has discovered, asking users to “agree” too often trains them to be click-happy and not give due consideration to the choices that really matter.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
