Prevent: What’s The Role of Technology?

Roughly what I said in my Digifest presentation yesterday

Since the Prevent duty, to help those at risk of radicalisation, was applied to universities and colleges there has been a lot of discussion of what role technology can play. The first thing to note is that, although there is a section on “IT Policies” in the Home Office Guidance, it’s only two paragraphs out of four pages. The rest covers the policies and processes that organisations should use to identify and help those at risk of being drawn into committing criminal acts, in particular planning or committing terrorist acts themselves or inciting others to do so.

Even those two “IT” paragraphs mostly cover policy and process: defining acceptable use and providing support for legitimate research. That leaves just a single sentence:

Many educational institutions already use filtering as a means of restricting access to harmful content, and should consider the use of filters as part of their overall strategy to prevent people being drawn into terrorism

On closer inspection, that seems odd. It’s talking about institutions that already use content filtering for other purposes, so have already bought the technology and implemented the policies and processes. But it still only says they should “consider” adding radicalisation to the list of harms to which the technology is applied. Why isn’t that obviously the right thing to do?

Here it’s important to remember the state of mind that Prevent is meant to address. The early stages of radicalisation usually involve a person feeling that an injustice has been done to some group they care about – “us” – and that “they” aren’t doing enough to resolve it. By the time this grievance has progressed to the stage of unlawful violent action it’s too late for Prevent: dealing with crimes is the job of the police and security services.

So how will someone with a grievance react if, when they try to find out more about their cause, they instead get a “prohibited content” banner? Some may go back to their studies, but others may conclude that the university or college has joined the conspiracy of “them”, and either do their research on another network or use simple technical tools to conceal their activities both from the blocking system and from the organisation’s other logs. Now we have someone on campus who is one step closer to being radicalised but whose activities no longer leave traces in our systems. HEFCE’s Prevent Monitoring Framework expects organisations to assess risks and take steps to mitigate them: the risk that inappropriate use of technical tools might make the problem worse should be something we consider.

Indeed HEFCE’s Advice note encourages educational organisations to consider “whether and how” to use filtering technologies. Questions that might feature in that consideration include:

  • Is technology more effective as a way to prevent radicalisation, or as a source of signs that it might be occurring?
  • What information do we have on how to use technology for either of those purposes? As with signs of radicalisation in the physical world, patterns of behaviour seem more likely indicators than any single act.
  • Where might our digital systems already have relevant information? Might significant changes of behaviour show up in e-mail or DNS logs, or even just times of logins?
  • How are our users likely to react? Does the existing organisational culture view technical systems as protecting or threatening individuals’ interests?
  • Perhaps most important, how do we keep those at risk within the systems we have created to help and guide them, rather than driving them away? All our attempts to support individuals could be badly undermined if our technical actions mean they view the organisation and its systems as a threat rather than a trusted helper.
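To make the “patterns of behaviour, not single acts” point above concrete, here is a purely illustrative sketch of the kind of signal a digital system might already hold: comparing a user’s recent login hours against their own historical baseline. The log format, threshold, and function name are all invented for the example; a real deployment would need careful design, and of course all the policy and human-process caveats discussed in this post.

```python
from statistics import mean, stdev

def login_hour_shift(baseline_hours, recent_hours, threshold=3.0):
    """Flag a large shift in a user's typical login time.

    baseline_hours: login hours (0-23) from the user's own history.
    recent_hours:   login hours from a recent window.
    Returns True if the recent average deviates from the baseline
    average by more than `threshold` standard deviations (a crude
    z-score; it ignores midnight wrap-around and much else).
    """
    if len(baseline_hours) < 2 or not recent_hours:
        return False  # not enough data to say anything
    base_mean = mean(baseline_hours)
    base_sd = stdev(baseline_hours) or 1.0  # avoid divide-by-zero
    shift = abs(mean(recent_hours) - base_mean)
    return shift > threshold * base_sd

# Example: someone who normally logs in mid-morning now appears at 3am.
usual = [9, 10, 9, 11, 10, 9, 10, 11, 9, 10]
recent = [3, 2, 3, 4]
print(login_hour_shift(usual, recent))  # True
```

Even this toy example shows why a single act is a poor indicator: one late-night login proves nothing, and any such signal is at best a prompt for a human conversation, never an automated judgement.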

Ultimately, Prevent is about changing minds: helping individuals understand the difference between constructive and destructive ways of raising and addressing concerns. Technology alone is very unlikely to be able to do that; indeed, it probably won’t even accurately identify everyone who needs that help. Only people can achieve the changes that Prevent tries to deliver: processes and systems must support them. Technology may have a small part to play in that but, as the guidance says, it must be part of a consistent overall strategy.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
