AI, Consent and the Social Contract

“Consent” is a word with many meanings. In data protection it’s something like “a signal that an individual agrees to data being used”. But in political theory “consent to be governed” is something very different. A panel at the PrivSec Global conference suggested that the latter – also referred to as the “social contract” – might be an interesting lens through which to consider the use of data, algorithms and Artificial Intelligence. The basic idea is that creating a society involves a balance: we give up some individual freedoms (for example, the freedom to choose which side of the road, and how fast, to drive; or to take whatever property we like) in order to create a communal life that works for everyone.

So how does that help in discussing new technology? How can we create technologies that enhance humanity rather than exploiting our weaknesses? First is the idea that a valid social contract must include everyone: it can’t be imposed by those with (current) political, technological or economic power. All views, impacts and situations need to be considered and weighed. That means we need to make the discussion accessible, particularly to children and the (digitally) vulnerable. That may actually be easier if we debate principles rather than technological details: “it’s too complicated” and unquestioned following of algorithmic outputs are signs that we’re getting the debate wrong. Complication, algorithms/goals and data sources are all choices made by humans: we need to discuss whether and when it’s acceptable to make those choices. Above all, principles (and systems) must be designed for the vulnerable and, maybe, adapted by those with greater autonomy: not the other way around. Tools such as consequence scanning and ethically-inclined design can help us explore possible futures.

To claim this kind of consent, organisations must commit to putting the principles into practice, and their doing so must be monitored and publicly reported on. As in many fields, without “trust but verify” there will be a natural tendency for practice to creep into loopholes. Data Protection Officers may be the first layer in this verification, but maintaining their independence and capability is likely to need external support and reinforcement. And we must beware of confusing adoption with acceptance. Something that is convenient but resented is not part of the social contract and should not be read as such. Popularity creates a particular risk: widespread reluctant adoption may squeeze out the alternatives that would be a better fit for the social contract. The difficulty of buying paper street maps of major cities (“everyone uses their phone”) was mentioned as an example. Bringing new technologies within the social contract won’t be quick or easy, but doing so should reduce the risk of individual harm or resistance, and of a future “techlash” by parts or all of society.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
