Online Safety Bill: effect on small services

Over the past few months there has been a lot of discussion of the impact of the Government’s Online Safety Bill on large providers. Ofcom’s July 2022 Implementation Roadmap (p5) estimates that there are 30-40 of those, to be covered by Categories 1, 2a and 2b. However, the roadmap mentions a further 25,000 UK services that will be in scope of the Bill: “Broadly speaking, services where users may encounter content (such as messages, images, videos and comments) that has been generated, uploaded or shared by other users will be in scope of the online safety regime” (p11). There are some exemptions in the Bill but, for example, none of them seems to apply to the comment feature on this blog. What might the Bill require here?

Although the Bill has been subject to considerable change, two types of content have been a consistent focus: “illegal” and “harmful to children”. In each case it’s envisaged that there will be a list of specific kinds of harm: service operators will need first to assess the risk of each kind of material appearing, then apply safeguards appropriate to that risk. Whether the children list needs to be considered depends on each service’s assessment of “whether children are likely to access their service or part of their service” (p16). The categories considered “harmful to children” will be defined in a future statutory instrument; those considered “illegal” are currently in Schedules 5 (Terrorism), 6 (Child Sexual Exploitation and Abuse) and 7 (Assisting Suicide, Threats to Kill, Public Order Offences, Drugs and Psychoactive Substances, Firearms and Other Weapons, Illegal Immigration, Sexual Exploitation, Sexual Images, Proceeds of Crime, Fraud, Financial Services crimes), though this may change.

All services will need to implement processes to receive reports of content in their relevant categories, to take down (at least) illegal content, and to deal with complaints about these processes (p14). There will also be duties on “review and record keeping” (p13) including – according to clause 29 of the current Bill – “every risk assessment”, “any measures taken” and regular compliance reviews.
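To make those duties concrete, here is a minimal sketch, in Python, of the kind of records a small service might keep: content reports received, the action taken on each, and regular compliance reviews. It is purely illustrative: neither the Bill nor Ofcom prescribes any format, and every name and field below is my own invention.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

# Purely illustrative: neither the Bill nor Ofcom prescribes any format or
# field names; everything here is my own guess at what might be useful.

@dataclass
class ContentReport:
    """A user report about content in one of the relevant categories."""
    reported_at: datetime
    content_url: str
    category: str                        # e.g. "fraud", "threats to kill"
    action_taken: Optional[str] = None   # e.g. "removed", "no action needed"
    action_at: Optional[datetime] = None

@dataclass
class ComplianceRecords:
    """Record keeping along the lines of clause 29: risk assessments,
    measures taken, reports handled and regular compliance reviews."""
    risk_assessments: List[str] = field(default_factory=list)
    measures_taken: List[str] = field(default_factory=list)
    reports: List[ContentReport] = field(default_factory=list)
    review_dates: List[datetime] = field(default_factory=list)

    def log_report(self, report: ContentReport) -> None:
        self.reports.append(report)

    def log_review(self) -> None:
        self.review_dates.append(datetime.utcnow())
```

Whether kept in code, a spreadsheet or on paper, the point is the same: a small service needs somewhere to record what was reported, what was done about it, and when its processes were last reviewed.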

For small sites, the amount of work will depend heavily on the required risk assessments and safeguards. The Bill seems to require that these be done separately for each kind of harm (current clause 8(5)(b)(i)), but the details of how to assess and what protection is required are left to Ofcom (I sketch below what such an assessment might record). For illegal content, their Roadmap suggests:

“This must assess, amongst other things, the risk of individuals encountering illegal content on a service, the risk of harm presented by illegal content and how the operations and functionalities of a service may reduce or increase these risks” (p14)

and

“All services will need to put in place proportionate measures to effectively mitigate and manage the risks of harm from illegal content.” (p14)

There are similar requirements for the “harmful to children” categories.
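What might one of those per-category assessments actually record? Here is the rough sketch promised above, again in Python and again purely illustrative: the fields, scales and example values are my own assumptions, since Ofcom has not yet said how assessments should be carried out or documented.

```python
from dataclasses import dataclass
from datetime import date
from typing import List

# Illustrative only: the fields below are my own reading of what clause
# 8(5)(b)(i) and the Roadmap ask for, not anything Ofcom has defined.

@dataclass
class HarmRiskAssessment:
    harm_category: str           # e.g. "terrorism" (Sch. 5), "fraud" (Sch. 7)
    assessed_on: date
    likelihood: str              # risk of users encountering such content
    severity: str                # risk of harm if it is encountered
    relevant_functionality: str  # how the service's features raise or lower the risk
    mitigations: List[str]       # the "proportionate measures" in place

# Hypothetical example: the comment feature on a small, pre-moderated blog.
blog_comments_fraud = HarmRiskAssessment(
    harm_category="fraud",
    assessed_on=date.today(),
    likelihood="low",
    severity="low",
    relevant_functionality="comments are only published after human approval",
    mitigations=["human pre-moderation of all comments",
                 "contact address for reports and complaints"],
)
```

Repeat that for every category in Schedules 5, 6 and 7 (and, once defined, the “harmful to children” list) and the scale of the exercise for a small service becomes clearer.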

A lot will depend on those words “proportionate” and “effectively”. Will it be sufficient, for example, to say that all comments on this site are already checked and approved by humans before they are published? I can’t think what we could do that would further reduce the (I hope, low) risk of encountering illegal or harmful content. Ofcom do note that large services have “capabilities and resources that vastly outstrip those of most in-scope services” (p8) and that “each service’s chosen approach should reflect its characteristics and the risks it faces” (p5). But the Bill applies the same risk management framework to everyone, so Ofcom’s flexibility may be limited.

The Bill was significantly changed in December 2022, and Ofcom’s Roadmap refers to an earlier version; I have concentrated here on areas that were not affected. However, the Bill has yet to go to the House of Lords (expected January/February 2023), and both government and opposition have declared their intention to make further changes there, so other obligations may appear or disappear. But if it is to become law, the Bill needs to be agreed before the summer. Ofcom’s powers will commence two months after that happens, and the Roadmap envisages a consultation on draft guidance on illegal content shortly thereafter, with a final version a year later (p7). The categories harmful to children need to be defined in further legislation, so that guidance is likely to appear later, following a similar process.

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
