The ICO’s Age Appropriate Design Code (more familiarly the “Children’s Code”) may have been written before lockdown, but it could provide useful guidance to everyone designing or implementing systems for the post-COVID world. We’re all trying to work out what a “hybrid” world should look like, whether in schools, colleges, universities, workplaces or social spaces. A Code that helps us provide respectful digital systems should be relevant to all these and more.
It’s also worth remembering that, as a statutory Code, the guidance fits within the legal requirements of the GDPR. It doesn’t create new law. It may highlight features that are particularly important when working with children, but only to suggest how to comply with what is already the law for users of all ages. And which of us adults wouldn’t welcome digital services that offered clear explanations of what they were doing, had respectful default settings, and didn’t try to push the boundaries of the law?
The ICO describes users being “‘datafied’ with companies and organisations recording many thousands of data points about [you]. These can range from details about [your] mood and [your] friendships to what time [you] woke up and when [you] went to bed”. That is creepy whether you are 8 or 80, not something that suddenly becomes acceptable on a particular birthday.
The Code sets out 15 “technology-neutral design principles and practical privacy features”:
- Best interests of the [individual]: which should already be our focus in education;
- Data Protection Impact Assessments (DPIAs): to help understand the risks and how we might mitigate them; we’ve also found them a really useful way to show individuals that we are taking care of them and their data;
- Age-appropriate application: provide services and systems that meet the needs and safeguards of your users (the ICO specifically notes that you can “apply th[is] standard to all your users” if you aren’t sure of their ages or needs);
- Transparency: using clear language – accessible to the likely audience(s) – to explain what we are doing with data (a GDPR requirement in any case);
- Detrimental use of data: the ICO provides an indicative list of detrimental uses, and notes industry sector standards to be aware of;
- Policies and community standards: acting in accordance with our statements, and our community’s expectations;
- Default settings: it’s not just children who “will just accept whatever default settings you provide”; that’s why “privacy by default” is a GDPR requirement;
- Data minimisation: collecting and using only the minimum data you need, and for the time you need it; again, a GDPR requirement for everyone;
- Data sharing: arguably a principle that is stricter for adults than for children – there may be lawful reasons to share children’s data with parents & guardians, or in emergencies, but make sure you know what they are and have well-defined processes for them;
- Geolocation: location data has always been regarded as particularly sensitive (for everyone) under ePrivacy laws, but has never made it explicitly into GDPR. This is a reminder of that sensitivity;
- Parental controls: adults might want to set these for themselves, too; most people still have things they would prefer not to encounter accidentally;
- Profiling: again, the GDPR warnings about profiling apply to users of all ages. The Code suggests that it should be off by default;
- Nudge techniques, or “exploitation of human psychological bias”: should not be deployed to encourage users to make bad (for them) choices;
- Connected toys and devices: remember that users may forget that it’s not just a toy/smart speaker/fitness device/etc. but also a powerful data collection device;
- Online tools: that make it easy for individuals to exercise their data protection rights.
Which of these would you want to reserve only to children, and not want in services designed for adults too?
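Several of these standards map naturally onto implementation choices. As a minimal, illustrative sketch (the class and field names here are hypothetical, not taken from the Code), “privacy by default” and data minimisation might look like:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class PrivacySettings:
    # "Privacy by default": the most protective options are the starting
    # point; users must actively opt in to anything more permissive.
    profile_visible: bool = False
    profiling_enabled: bool = False       # profiling off by default
    share_with_third_parties: bool = False

@dataclass
class CollectedRecord:
    # "Data minimisation": keep only the fields the service needs,
    # and record when the data should be deleted.
    user_id: str
    email: str
    expires_at: datetime

def minimise(raw: dict, retention_days: int = 30) -> CollectedRecord:
    """Keep only the fields we need, with an explicit retention period."""
    return CollectedRecord(
        user_id=raw["user_id"],
        email=raw["email"],
        expires_at=datetime.now(timezone.utc) + timedelta(days=retention_days),
    )

# Everything else in `raw` (mood, friendships, sleep times...) is simply
# never stored.
record = minimise({"user_id": "u1", "email": "a@example.org",
                   "mood": "tired", "woke_up": "06:30"})
```

The point of the sketch is that the respectful behaviour is the zero-effort path: a developer who forgets to configure anything still ships protective defaults and a bounded retention period.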
An example in the ICO’s Frequently Asked Questions highlights that the principles aren’t bans, but areas to think carefully about. Using geolocation to provide information relevant to an international student’s country is fine, but let them choose whether to give you access to location data, and stop using that access as soon as you have the information you need.
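That ask, use once, then stop pattern can be sketched as follows (the consent flag and the coordinate and country-lookup functions are hypothetical placeholders for whatever your platform provides):

```python
from typing import Callable, Optional

def country_from_location(
    has_consent: bool,
    get_coordinates: Callable[[], tuple[float, float]],
    lookup_country: Callable[[tuple[float, float]], str],
) -> Optional[str]:
    """Use location access once and keep only the derived country."""
    if not has_consent:
        return None  # the user chose not to grant access
    coords = get_coordinates()  # a one-off read, not a background subscription
    country = lookup_country(coords)
    # The raw coordinates go out of scope here: we retain only the
    # information we actually need (the country), not the location itself.
    return country
```

The design choice is that location access is scoped to a single function call, so there is no lingering handle to keep querying after the answer has been obtained.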
In a few cases it may be reasonable to expect adults to (slightly) better understand the consequences of their choices. But distracted post-lockdown adults will still be grateful for clear explanations and services that just do the right thing.