
Online Harms White Paper

The Government’s new White Paper on Online Harms is strikingly wide in both the range of harms identified and the range of entities asked to play a part in reducing them. The White Paper envisages that harmful content could be spread through any online facility that allows individual users to share content, to find content shared by others, or to interact with each other. The White Paper – recognising that this includes not just entities usually classed as social media platforms but also “retailers that allow users to review products online, along with non-profit organisations” – encourages a proportionate, risk-based approach to regulation. This will be essential, as many of the technical tools used by major social networks to block the uploading of unlawful or harmful material to their sites are unlikely to be available to the thousands of retailers whose review pages might, in theory, be used as a venue for abuse.

Although universities and colleges may offer comment and public feedback pages, they are likely to have already assessed the risk of them being used for the main types of harm identified in the White Paper. Colleges’ existing safeguarding duties should already cover the risk of their online services being abused in ways harmful to young people; both universities and colleges should have considered the terrorism risk as part of their Prevent duties.

The White Paper envisages that the measures expected of organisations at different risk levels will be set out in Codes of Practice produced by the (yet to be appointed) Regulator. Given the relatively low attractiveness of university or college pages for disseminating harmful material, it would be surprising if these Codes required more than is likely to be in place already: an effective route for flagging inappropriate content, with post- or pre-moderation as a fallback option if a site were actually to become a target for misuse.

[UPDATE 16/4/19: The Government has just published an advisory Code of Practice for Social Media Platforms – as required by the Digital Economy Act 2017 – that suggests even less than this (moderation is not mentioned).]

By Andrew Cormack

I'm Chief Regulatory Advisor at Jisc, responsible for keeping an eye out for places where our ideas, services and products might raise regulatory issues. My aim is to fix either the product or service, or the regulation, before there's a painful bump!
