Robin Wilton of the Internet Society gave a talk at the TERENA Networking Conference on the interaction between privacy, regulation, and innovation. It is commonly claimed that regulation stifles innovation; yet the evidence of premium-rate phone fraud and other more or less criminal activities suggests that regulation can, in fact, stimulate innovation, though not always of the type we want. So perhaps our focus, rather than resisting regulation, should be on devising regulation that promotes socially beneficial, rather than socially harmful, innovation.
It is generally considered that markets are more flexible and efficient than regulation, so regulation should only be used where there is a social need to which the market does not give sufficient weight. That does seem to be the case for privacy, where society's interest in the protection of individuals is greater than the value the market assigns to it. Saying that "personal data is the new oil" suggests, perhaps unintentionally, both the high economic value of exploiting it and the need for that exploitation to be regulated to avoid serious harm. At the moment, innovation in the intrusive use of personal data seems to be happening more quickly and more widely than innovation in protecting it.
Unfortunately, regulating privacy, particularly on the Internet, has turned out to be hard. Robin suggested two principles that should make this easier: regulate risks rather than threats, and behaviour rather than technology.
It is very tempting, and good rhetoric, to pick out particular threats to privacy that we want regulated. Unfortunately, since regulation can stimulate anti-regulatory innovation, there will be an ever-growing list of threats in any one area, so this approach guarantees a regulatory arms race that will probably frustrate beneficial innovation. Rhetoric also naturally leads to an antagonistic approach where positions diverge and I can only 'win' if you 'lose'.
Technology is also a tempting target for regulation, but regulators should be aware of the programmers' mantra: there's more than one way to do it. Unless it is the technology itself that causes harm (high voltages, radiation, etc.), someone with sufficient economic motivation can almost certainly find a different, unregulated, technological way to achieve their goal, causing at least the same harm in the process.
The EU's attempt to regulate cookies provides a striking illustration of both problems. Rather than identifying a particular risk arising from Internet profiling and regulating that, the legislation targeted one specific technological approach to profiling and regulated every threat that happened to use that particular technology. Regulators have since recognised that many of those threats were actually negligible, and that some brought significant benefits to users, but they cannot change the law. Anyone trying to innovate with cookies is now faced with confusing regulatory advice; meanwhile, those involved in profiling are free to move to other technologies such as super-cookies and browser fingerprinting, which are harder for the user to control and may well be even more intrusive.
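To see why fingerprinting is harder for users to control than cookies: a cookie is an identifier the site stores on the user's machine, which the user can inspect and delete, whereas a fingerprint is recomputed on every visit from attributes the browser reveals anyway, so there is nothing to delete. The following is a minimal illustrative sketch (in Python rather than browser JavaScript, and with entirely hypothetical attribute names and values) of how such an identifier can be derived:

```python
import hashlib

def browser_fingerprint(attributes: dict) -> str:
    """Derive a stable identifier from attributes a browser exposes.

    Nothing is stored on the user's machine: the same attribute
    values yield the same identifier on every visit, so clearing or
    blocking cookies has no effect on this kind of tracking.
    """
    # Serialise the attributes in a stable, sorted order before hashing,
    # so the identifier does not depend on dictionary ordering.
    canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Hypothetical attributes of the kind a tracking script might read.
visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "1920x1080x24",
    "timezone": "Europe/Amsterdam",
    "fonts": "Arial,DejaVu Sans,Liberation Serif",
    "language": "en-GB",
}

print(browser_fingerprint(visitor))
```

The point of the sketch is that the combination of many individually unremarkable attributes is often unique enough to re-identify a visitor, and that no user-visible artefact (such as a cookie) is involved, which is precisely what puts it beyond the reach of cookie-specific regulation.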
Innovation needs experiment and a willingness to do things differently. According to Julie Cohen “privacy … shelters the processes of play and experimentation from which innovation emerges”. Privacy needs to be protected by regulation, but it must be the right sort of regulation.