“Digital Ethics”, or Ethical Digitalisation?

Perhaps surprisingly – given that its title was “Digital ethics” – last week’s SOCITM panel session spent a lot of time exploring things that aren’t “digital”. Although the discussion focussed on local government, a lot of the ideas seemed relevant to education, too.

Don’t be solutionist: technology might not be the right option.

When identifying issues, might a survey be more effective than using “big data”? It’s probably less privacy-invasive and – unlike using existing data – you can work to make sure the results represent all of the affected community. Even in universities and colleges, disparate access to – and confidence with – digital devices means that data from digital services will not be representative. Working out how unrepresentative they are after the data have been collected may well be more effort than gathering a representative sample in the first place. And engaging with affected people opens up discussion about the actual reasons for behaviour, rather than assuming those can be inferred from observations.

When proposing solutions, again, actively seek out and work with those who will be affected. Start grounded and simple, both in the aims and how they are communicated. Take as much time as is needed to explain and understand: time spent at this stage should be more than recouped later in the development. And don’t be simplistic. Be open about risks, and invite dissenting views. Reduce the number of unanticipated (by you) problems at this stage, not after you have implemented the system. Late-2020 is a good opportunity to escape the idea that technology is, or can be, perfect! Be honest about the objectives – cost reduction or staff redeployment may be OK, but don’t dress them up as something else. Know how progress against those goals will be tested, and what will be done if things don’t work out as hoped.

When implementing solutions (whether digital or not), ensure this is done by a multi-disciplinary team, not just statisticians and technologists. And make sure this is genuine engagement, not just a tickbox. Even if you don’t formally adopt Agile methods, try to test and learn during development, not just after. And capture and share the lessons of failures, not just successes: the former are at least as valuable. Be willing to learn from, and build on, other communities, tools, and resources: starting from scratch shouldn’t be necessary. Every process will need to include some facility for human contact, if only to detect and remedy the cases where “computer says no”. And be open about when it is a computer – Amsterdam’s new “Algorithm Register” is an interesting approach, though as it grows I suspect it will need to become more granular to avoid enforcement support systems getting lost in a swarm of chatbots.

Don’t (just) think “digital ethics”: think “ethical process change”.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
