Consent and the Role of the Regulator

Reading yet another paper on privacy and big data that concluded processing should be based on the individual’s consent, I was struck by how much that approach limits the scope and powers of privacy regulators. When consent is used to justify processing, pretty much the only question left for regulators is whether that consent was fairly obtained – effectively they are reduced to commenting and ruling on privacy notices. And, indeed, a surprising number of recent opinions and cases do seem to be about physical and digital signage.

But in an area as complicated as big data, where both the potential risks and benefits to individuals and society are huge, I’d like privacy regulators to be doing more than that. It seems pretty clear that there will be some possible uses of big data that should be prohibited – no matter how persuasive the privacy notice – as harmful to individuals and society. Conversely, there are other uses whose benefits to both should legitimise them without everyone having to agree individually. Privacy regulators ought, I think, to be playing a key role in those decisions, something that invoking “consent” prevents them from doing.

There is an existing legal provision that would let regulators discuss much meatier questions: whether processing is “necessary for a legitimate interest” and whether that interest is “overridden by the fundamental rights of the individual”; however, until recently it hasn’t been much used. The Article 29 Working Party’s Opinion on Legitimate Interests is a promising start, but it would be good to see regulators routinely discussing new types of processing in those terms. Looking at big data, and other technologies with complex privacy effects, explicitly in terms of the benefits they might provide and the harms they might cause – maximising the former and minimising the latter – seems a much better way to protect privacy than simply handing the question to individuals and then considering, after it is too late, whether or not their consent was fairly obtained.

Analysing applications in terms of legitimate interests and personal rights could even benefit those organisations that want to do the right thing. A business that can demonstrate, in terms approved by a privacy regulator, how its activities provide a significant benefit without threatening the fundamental rights of its customers would seem to have a strong ethical and legal position: at least as good as that of one claiming “those consequences were clear from the privacy policy you consented to”. An interesting survey of trust in different public sector organisations suggests this may be a calculation we are already making instinctively. And if this approach were to become the norm, it might even provide a signal of its own – that a proposition which doesn’t make the legitimate interest/fundamental rights case, but relies instead on user consent, should be examined very closely by those users.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
