
Signposts for ethics in immersive technologies

[What I meant to say at the Westminster e-Forum on Immersive Technologies]

Here we have some creepy applications of immersive technologies. Body-cameras and mobile phone apps that scan every passing face and search out whatever they can discover about the person’s identity and contacts… Incidentally, I’ve no idea what the smiley faces on the body-cam mean, but can you think of a less appropriate marker for someone “Wanted for felony…”?

And here we have some inspirational ones. Virtual fieldtrips, students preparing to work safely in hazardous environments. Similar technology is also being used to train those building and maintaining nuclear power stations and working in food processing plants. I could also have added students collaborating to develop ideas for more environmentally friendly workplaces, students being assessed on their practical knowledge and skill in anatomy, researchers using VR to design new drugs…

So what’s the difference, and how do you ensure you build the latter, not the former?

Legal compliance should go without saying: safety, discrimination and data protection law in particular. And the General Data Protection Regulation’s guidance on designing Fair, Transparent and Accountable systems is good, whether you are using personal data or not.

But legality isn’t enough. Some applications that are lawful (at least at present), like face recognition, are nonetheless very creepy; as are some applications that don’t seem to involve personal data at all. Racist hand-dryers, for example. So what more do we need?

I’d suggest it’s about “respect”, in three different ways:

First, respect for equality. Not just in the sense of non-discrimination, but equality of arms. By all means use immersive technologies to assist expert surgeons, but not to create new inequalities or widen digital divides. Instead, they should be used to increase opportunity for all. Why can’t I have the best scientist on the planet as my buddy in VR-space?

I think this is why the face-scanning body-cam offends. An officer on foot is claiming equality: someone I can have a conversation with. A camera recording that interaction is for both our safety. But face recognition reverts to the dominant, controlling position of an officer on a horse or in a car.

Second, respect for context. This is Helen Nissenbaum’s idea that spaces and situations carry implicit expectations about data purposes and flows. In a conference room we wear name badges; here, perhaps, it might be acceptable to augment the Chair’s memory with a University Challenge-style voiceover when someone raises a hand (which illustrates, incidentally, that AR doesn’t have to be limited to augmenting vision). Outside, AR should not be making each of us expose our LinkedIn contacts on a virtual sandwich board. Context is probably more of a challenge for AR, which intrudes into existing contexts, than for VR, which defines its own.

Third, respect for humanity. I was going to say respect for rights, but it’s more than that. Don’t create superpowers or addicts, and don’t release Pokémon Go characters into crowded streets. And think carefully before interfering with practical obscurity – what we don’t know about each other is at least as important for our sociability as what we do. Panopticons are for punishment.

In summary, if you respect us, then we are more likely to respect you for making a positive contribution to society, rather than shun you and your users. Comics and movies have known for nearly a century that superheroes are likely to be social misfits: let’s not make that our technological reality.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
