
Is our technology comforting?

When I was invited to join a panel at the SOCITM ShareNational event for local government I presumed my role was to provide a different, external, perspective on “Ethical Use of Emerging Technologies and Data”. So I offered to contribute a five-minute “sparkler” introduction: a bit of illumination, some striking of ideas, maybe a smile. In fact, the conference programme was already buzzing with new thinking (there will be lots more blog posts next week), so I didn’t need to add to that. Here’s what I would have said…


Over the past decade or so my job has been to work with Jisc colleagues to ensure the networking and analytics services they want to provide are “lawful”. More recently, they’ve been asking about “ethical”. But maybe what we actually need is “comforting”. That’s different, because it’s about how others perceive those services, not how we think about them.

We need to talk…

There are a couple of phrases I don't find comforting. The first is "For the Greater Good". So why are you throwing me under a bus? Aren't there enough ways to use technology and data that benefit everyone?

Perhaps more surprisingly, the second is "Individual Control". Why would that be important if the proposal is a no-brainer? Too often "individual control" is a sign of laziness: we can't work it out, so you do it. We should be working together to find those no-brainers, then use individual control to spot and address situations that couldn't have been foreseen, not as a belated acceptance test that reveals we built the wrong thing.

So be careful about digital volunteering. It might help, but it might also amplify digital divides into visibility divides. I was shocked to discover that when I assume "students have smartphones", I'm actually missing one in six.

An example: I love the idea of an app that uses phone accelerometers to detect and report potholes. So cool! But… What about drivers who don’t have phones? What about cyclists, pedestrians and residents, who may welcome potholes as informal traffic-calming measures? Does that app actually identify desire lines and divert scarce resources to building smooth rat-runs?
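The detection side of such an app can be surprisingly simple: watch for spikes in the phone's vertical acceleration and tag them with a GPS fix. Here's a minimal sketch, using simulated readings rather than a real sensor API; the Sample structure, threshold value and coordinates are illustrative assumptions, not taken from any actual pothole app.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Sample:
    lat: float              # GPS latitude at the moment of the reading
    lon: float              # GPS longitude
    vertical_accel: float   # vertical acceleration in m/s^2, gravity removed

# Jolts above this size are treated as candidate potholes.
# Real apps would tune this per vehicle and speed; 4.0 m/s^2 is a guess.
JOLT_THRESHOLD = 4.0

def detect_potholes(samples: List[Sample]) -> List[Sample]:
    """Return the samples whose vertical jolt exceeds the threshold."""
    return [s for s in samples if abs(s.vertical_accel) > JOLT_THRESHOLD]

if __name__ == "__main__":
    # Simulated drive: mostly smooth road, one sharp jolt.
    drive = [
        Sample(51.45, -2.58, 0.3),
        Sample(51.45, -2.58, 0.5),
        Sample(51.46, -2.59, 6.2),   # the pothole
        Sample(51.46, -2.59, 0.4),
    ]
    for hit in detect_potholes(drive):
        print(f"Candidate pothole at ({hit.lat}, {hit.lon})")
```

Even this toy version makes the bias visible: it can only ever report jolts felt by phone-carrying drivers, so the resulting map reflects their routes, not everyone's roads.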

I’ve been using four questions to explore new ideas for data and technology:

  • Will it help? Do we have a process that would be improved by any signal the data might produce?
  • Will it work? Will there be a sufficiently accurate signal, or am I relying on false assumptions about technology or behaviour?
  • Will it comfort? Or will I be perceived as Big Brother?
  • Will it fly? What are the broader effects on the community and society?

As a colleague observed, after exam results and virus testing, probably more people than ever before now have personal experience of the discomfort caused by inappropriate use of data. If we can use that engagement opportunity to move the discourse from “mutant algorithms” to pride in how our community uses technology, then we’ll have salvaged something really valuable from 2020.


[UPDATE 8/1/21: the “four questions”, and how they might be used in practice to assess ideas for data (re)use, became the subject of a peer-reviewed paper – “Between the Devil and the Deep Blue Sea (of Data)” – published in the Journal of Law, Technology and Trust.]

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
