
Risk trade-offs? Or spirals?

A couple of recent discussions have mentioned “trade-offs” between risks. But I wonder whether that might sometimes be a misleading phrase: concealing dangers and perhaps even hiding opportunities? “Trade-off” makes me think of a see-saw – one end down, the other up – which has a couple of implications. First, the two ends are in opposition; and second, we can always change our minds, change the weights, and things will go back to where they were.

But think about a real-world example from 2014: care.data. Here a risk was identified: that medical research would be limited by a shortage of data from real patients. I’ve no idea if the proposed solution was thought of as a “trade-off”. But making patient data from their family doctors available to a central research service was seen by many people as increasing the risk to their personal privacy. Individuals could ask their doctors not to transfer their data. But doctors identified another risk: that patients would say less about their symptoms if they thought the details might go beyond the consulting room, so treatment would be less well informed. To mitigate that risk, some doctors stated publicly that they would not be participating in the scheme.

Confidence fell, resulting, according to the 2022 Goldacre Review, in “very large numbers of patients opting out of their records ever being shared outside of their GP practice (approximately three million by the end of 2021) with opt-outs now at a scale that will compromise the usefulness of the data” (p88). To put it another way, because of linked risks, the attempt to reduce the risk of insufficient research data actually made that risk worse. So the see-saw image was wrong on two counts: the risks to research and patient privacy weren’t actually opposed, but linked in a way that created feedback; and that feedback changed the environment so the see-saw couldn’t (easily) be returned to its original position.

Thinking instead of a spiral (OK, technically a helix) better explains what happened: the linked risks took the system around a loop (data => individual => doctor => data) but when, some months later, it returned to the same point in the cycle, things had deteriorated and the original situation could not be recovered.
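If it helps to see the loop dynamics concretely, here is a purely illustrative toy model – every number and parameter is invented, none of it comes from the Goldacre Review or any real opt-out statistics. The only point it makes is the one above: each trip around the loop compounds the last, so a “gain” below 1 spirals downwards and one above 1 spirals upwards, and neither simply returns to the starting position.

```python
# Toy model of the care.data-style feedback loop (data => individual => doctor => data).
# All parameters are invented for illustration only.

def spiral(trust: float, gain: float, steps: int = 4) -> list[float]:
    """Return the fraction of patients willing to share data after each loop.

    gain > 1 models a virtuous spiral (each loop builds confidence);
    gain < 1 models a vicious one (each loop erodes it).
    """
    sharing = []
    for _ in range(steps):
        trust = min(1.0, trust * gain)   # individuals react to how their data was handled
        sharing.append(round(trust, 3))  # willingness to share feeds the next loop
    return sharing

print(spiral(trust=0.9, gain=0.8))  # [0.72, 0.576, 0.461, 0.369] - downward spiral
print(spiral(trust=0.9, gain=1.1))  # [0.99, 1.0, 1.0, 1.0]       - upward spiral
```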

But spirals can go both ways… Can we use linked risks to make a situation better? Regulators suggest a couple of ways. According to the Information Commissioner (specifically referring to consent, but the point is general):

  • “Getting this right should be seen as essential to good customer service: it will put people at the centre of the relationship, and can help build confidence and trust. This can enhance your reputation, improve levels of engagement and encourage use of new services and products.”

But:

  • “Handling personal data badly … can erode trust in your organisation and damage your reputation. Individuals won’t want to engage with you if they think they cannot trust you with their data; you do things with it that they don’t understand, want or expect; or you make it difficult for them to control how it is used or shared”.

Here are clear statements of the link between the risk of insufficient data for business and the risk of privacy invasion for customers, and the possible spirals. The first is a spiral of improvement: if a business uses personal data in ways that also mitigate customers’ risks, then those customers may be willing to volunteer more information, which can then be used for mutual benefit. Now we have gone around the helix, but arrived at a better place for everyone. Similarly, but at a sector or societal level, the European Commission’s Recitals to the NIS2 Directive suggest that appropriate sharing of information to improve the security of systems and data could increase confidence and make individuals more willing to transact through digital systems, with benefits for individuals, organisations, “economy and society”.

Considering whether risks might be mutually reinforcing (in either a positive or negative direction), rather than a trade-off, might help us find positive opportunities or, at least, highlight the risk of downward spirals before they do serious damage.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
