On May 13, the European Union’s Court of Justice ruled that citizens under its jurisdiction have a so-called “Right to be Forgotten”: essentially, a right to leave embarrassing or adverse information from their past in their past. On its face, the notion is appealing. Who wouldn’t want to be judged by the person he is today, as opposed to a younger, inexperienced, immature, and possibly reckless self? Starting over with a clean slate may never have been a civil right, but many took it for granted. For previous generations, mobility and the anonymity of mass society made it practical.
Information technology destroyed that. Digitized data can last, in theory, forever. The march of information technology has made it ever easier to collect, store, process, and recall that information. The warning not to put anything online that one doesn’t want the world to see flows from this technological reality. The benefits of those technologies (increased productivity and economic growth, customized applications, efficiency in the machines that make modern life possible, access to a wider circle of information and people, indeed, the very ability to customize a growing segment of the world around us) have come at more than a monetary cost; they largely destroyed the anonymity of industrial-age society. That very anonymity undergirded many of our experiences and expectations about privacy, including the ability to leave the past in the past. Like King Canute ordering the tides to stop, the EU Court’s ruling seeks to reverse this trend, to undo the inevitable, to reverse the consequences of technology. (Canute, of course, intended to demonstrate his powerlessness in the face of nature.)
While the Court’s ruling is unlikely to reverse the tides of technology, it will have pernicious effects on the course of technological development just the same. The Court made search providers responsible for information cast into cyberspace by third parties, turning them into the Internet’s biggest censors. Restricting search services will not prevent anyone from placing information in cyberspace, but it will prevent people from retrieving it through any affected service. The result may not be censorship in the strictest sense of the word, but the effect is very nearly the same.
Worse, the ruling will do nothing for privacy. Search and archiving technologies are old, simple, and relatively easy to reproduce. The case before the Court involved a subject’s complaint that an Internet search using Google’s service returned third-party information about an embarrassing financial situation from decades back. In ruling that the individual involved had a “right” to prevent such information from appearing in search results, the Court imposed on Google an obligation to change the way its software functions. Google can surely adjust its search filters for this individual so that such information is disregarded in the future, just as it already adjusts them to screen out information that is illegal in certain countries and to reduce the amount of information unlikely to meet a user’s needs. Presumably, other search firms over which the EU can exert some authority will have to follow suit, if not on their own initiative, then following a request from the individual.
This process has the potential to repeat itself hundreds of millions of times, as citizens of EU countries begin petitioning search firms to disregard certain information about them. The volume of requests may well grow at a geometric rate, creating a decision-making morass that no firm or technology can resolve on a case-by-case basis. Instead, large firms will work to broaden their filters, applying them to an ever-growing amount of material and to entire classes of people. The search engines will, in effect, become less and less useful for finding any information.
That will not prevent anyone from seeing the offending material. First, citizens of the EU can still use search services from providers outside of the EU’s jurisdiction. As mainstream search engines are “dumbed down” to satisfy the ruling, foreign search services will become more popular. The Union may want to consider where some of these search firms are likely to be located, such as China or Russia.
Second, search technologies are relatively straightforward, easy to reproduce, and quick to spread. Individuals, non-profits, and the general “hacker” community will likely develop new tools that EU citizens can use for improved search, much as they create and spread tools to help political dissidents in countries that practice heavy censorship and online monitoring. Such technologies may not be as efficient, fast, or comprehensive as Google and its competitors, but they will likely exist beyond the reach of people who want to restrict the information available about them on the internet.
Regrettably from a privacy standpoint, there is no “erase” button in cyberspace. The internet protocol that lies at the heart of moving information into, through, and around cyberspace is a routing technology. It is intended to create multiple pathways for any piece of information to move from point A to point B. Attempting to restrict one pathway, such as by modifying filters, may close that pathway, but it will not prevent information from routing around the “blockage.” Over time, those alternative pathways will become more popular, encouraging the creation of still more pathways. Just as this phenomenon fueled the growth of the internet, it will render the EU Court’s ruling moot. Thus, the same technologies that made it difficult for Iranian mullahs to prevent images of the 2009 uprisings from getting out, that make it possible for a school full of kidnapped Nigerian girls to generate worldwide attention and sympathy, and that enable sick individuals in developing countries to consult live with specialists anywhere in the world, are simultaneously making it impossible to hide embarrassing information from prying eyes.