I’ve sent in a Janet response to the EU’s consultation “A Clean and Open Internet: Procedures for notifying and acting on illegal content hosted by online intermediaries”. At the moment the E-Commerce Directive (transposed into UK law as the Electronic Commerce (EC Directive) Regulations 2002) says that websites aren’t liable, under either criminal or civil law, for unlawful material posted to their sites by third parties until they are either notified of the alleged illegality or gain knowledge of it by other means (including their own investigations). Once they do know about the material they must remove it “expeditiously” to avoid liability thereafter.
This has been criticised both for discouraging sites from moderating or checking what is posted, and for encouraging them to remove material as soon as any complaint is received. The latter problem was highlighted as a human rights issue by the Law Commission in 2002, and I’ve recently discovered that last year the OFT pointed out it could also be a consumer protection one (if reports of bad service are suppressed by legal threats).
Within the limits of what is mostly a checkbox form for responding, I’ve tried to highlight those problems, particularly as they affect education organisations, which may be expected (and are sometimes required by law) both to proactively check content and to promote free speech. At the moment liability law actually discourages both.
However, in designing a better system it seems to me that there are two different kinds of illegality that may need to be dealt with separately. For one kind it is impossible to tell from the posted content alone whether or not it is unlawful: for example, content can’t be defamatory if it is true, and it can’t breach copyright if the poster has permission to post it. The website host can’t determine either of those from the information it has. For that sort of material I’ve suggested that the poster does need a “right of reply” to an allegation of law-breaking, whether that is established by a right to have material put back after takedown (as in US copyright law) or by requiring the poster to be contacted before material is removed (as seems to be the idea for the UK’s new Defamation Bill). For the other kind of material – which includes malware and indecent images of children – it is clear from the material itself that it is unlawful, so a right of reply would simply delay the removal of something that is plainly unlawful to publish or distribute.
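Purely as an illustration of that two-track idea (the consultation doesn’t specify any mechanism, and the category and function names here are entirely hypothetical), the distinction might be sketched like this:

```python
from dataclasses import dataclass
from enum import Enum, auto


class Category(Enum):
    # Unlawfulness is apparent from the content itself
    # (e.g. malware, indecent images of children)
    MANIFESTLY_UNLAWFUL = auto()
    # Unlawfulness depends on facts the host can't see
    # (e.g. defamation: is it true? copyright: was it licensed?)
    CONTEXT_DEPENDENT = auto()


@dataclass
class Notice:
    content_id: str
    category: Category
    complaint: str


def handle_notice(notice: Notice) -> str:
    """Hypothetical two-track notice-and-action flow."""
    if notice.category is Category.MANIFESTLY_UNLAWFUL:
        # A right of reply would only delay removal of material
        # that is plainly unlawful to publish or distribute.
        return f"remove {notice.content_id} immediately"
    # For context-dependent allegations, give the poster a chance
    # to respond before removal (or a right to reinstatement after).
    return f"notify poster about {notice.content_id}; await reply before acting"


print(handle_notice(Notice("file-42", Category.MANIFESTLY_UNLAWFUL, "malware download")))
print(handle_notice(Notice("post-7", Category.CONTEXT_DEPENDENT, "alleged defamation")))
```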
The consultation also asks about systems for reporting problems to websites. From the responses I’ve had from members of the Janet community it seems that most websites do react quickly when told of problems (given the legal position it would be odd if they didn’t), but that it can be difficult to find where to send reports. I’ve therefore agreed with the Commission’s suggestion that reporting mechanisms should be made more obvious, but pointed out that these may need to be suitable both for a human entering a single problem report and for an automated system reporting a batch of problems, such as a range of phishing sites. The consultation suggests that sites providing reporting interfaces should only have to respond to reports sent through them – I’ve suggested that, at the very least, a report sent by another channel shouldn’t trigger loss of liability protection.
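To illustrate the automated side of that (the endpoint, field names and format below are all invented for this sketch; the consultation doesn’t prescribe any of them), a reporter might bundle a range of phishing URLs into one machine-readable submission rather than filling in a web form once per site:

```python
import json
import urllib.request

# Hypothetical batch report: one submission covering several
# phishing URLs, instead of one web-form entry per problem.
batch_report = {
    "reporter": "abuse-team@example.ac.uk",  # placeholder contact address
    "type": "phishing",
    "items": [
        {"url": f"http://phish.example.com/login{i}", "seen": "2012-09-05"}
        for i in range(1, 4)
    ],
}

payload = json.dumps(batch_report).encode("utf-8")

# The endpoint below is invented for illustration; a real site would
# publish its own reporting address or API.
req = urllib.request.Request(
    "https://hosting-site.example/abuse/report",
    data=payload,
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # would submit the whole batch in one request
print(payload.decode())
```

Existing abuse-reporting formats such as X-ARF take a broadly similar structured approach, though nothing in the consultation commits to any particular scheme.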