In 1995, the European Parliament and the Council of the European Union adopted Directive 95/46 on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data (“the Directive”).1 Proposed in 1990, when the Internet did not yet exist in its modern form,2 and passed three years before Google was founded,3 the Directive was intended to regulate and supervise data controllers4 and ensure that data-processing systems “protect the fundamental rights and freedoms of natural persons, and in particular their right to privacy.”5 Recently, in Google Spain SL v. Agencia Española de Protección de Datos,6 the Court of Justice of the European Union (CJEU) interpreted the Directive as creating a presumption that Google must delete links to personal information from search results at the request of a data subject7 unless a strong public interest suggests otherwise. Many American and European analysts have attacked the decision as a mistaken legal interpretation of the Directive that gave too much power to private entities to control public information access. These critiques raise valid concerns, yet the suggestion that it is the court’s interpretation that is at fault misses the point; the legal interpretation was a reasonable reflection of the text of the Directive and the values embodied in it. Critics seeking meaningful change should thus use the decision and the conversation it has generated to shape the debate on what values should be represented — and how — in a new regulatory regime.
On March 5, 2010, Mario Costeja González, a Spanish citizen, lodged a complaint with the Spanish data protection agency, the AEPD,8 against a Spanish newspaper and against Google Spain SL (“Google Spain”) and Google Inc.9 An Internet user typing Costeja González’s name into Google’s search engine would receive links to two newspaper pages announcing a foreclosure auction on Costeja González’s home.10 In his complaint, Costeja González requested first that the newspaper be required to remove his name, and second that Google Spain “remove or conceal” his personal data so that they no longer appeared in the search results.11 Costeja González argued that because the attachment proceedings had been “fully resolved[,] . . . reference to them was now entirely irrelevant,”12 and he had the right to have the data removed.13
The AEPD denied Costeja González’s complaint against the newspaper, but granted it against Google. The newspaper had no obligation to remove the announcements, as they had been lawfully published.14 However, the agency reasoned that search engine operators were data controllers, that they were thus subject to the Directive, and that Google Spain and Google Inc. were therefore required to remove links to data upon request by the data subject.15 Google Spain and Google Inc. both appealed to the Spanish high court, which referred several sets of questions to the CJEU for a preliminary ruling concerning the proper interpretation of the Directive.16 The first set of questions17 addressed whether Google should be classified as a data controller (a requirement for being subject to the Directive) and whether Google, as a non–European Union company, was subject to the Directive’s territorial reach. If the court answered both questions affirmatively, it was then asked to determine the scope of Google’s legal responsibility as a data controller and whether a citizen had the right to have Google erase his data — in other words, the scope of the “right to be forgotten.”18
The CJEU’s preliminary ruling was consistent with the AEPD’s interpretation of the Directive. In examining whether Google was a data controller subject to the Directive, the court determined that a search engine’s activities constitute data processing19 because a search engine “‘collects’ such data which it subsequently ‘retrieves[,]’ ‘records[,]’ . . . ‘organi[z]es[,]’ . . . ‘stores’ on its servers[,] and [then] . . . ‘discloses,’”20 and because the data clearly include personal data.21 Given that a search engine operator “determines the purposes and means” of the data processing, a search engine operator should also be regarded as a data controller.22 As a data controller, a search engine operator must comply with the Directive.23
The court then determined that Google Inc.’s presence in Spain was sufficient to subject it to the Directive. Though all of Google Inc.’s data processing occurred outside Spain, Google Spain sold advertising space within the country; since advertising is Google Inc.’s main source of revenue, the court held that the two entities were “closely linked.”24 Google Spain was thus effectively an establishment of Google Inc., making Google Inc. subject to the Directive.25
Having resolved the threshold issues, the court turned to the next inquiry: what were search engine operators’ legal obligations under the Directive? The court noted that the Directive required a balancing test26: while personal data processing was permitted when it was necessary to serve the controller’s or third parties’ legitimate interests, it was not permitted “where such interests are overridden by the interests or fundamental rights and freedoms of the data subject — in particular his right to privacy.”27 Given the “seriousness of [the] interference” with a data subject’s rights, an operator’s economic interests were never sufficient to justify interference with privacy rights;28 moreover, privacy rights “override, as a rule . . . the interest of the general public” in having access to private information.29 This presumption could be overcome only “by the preponderant interest of the general public in having . . . access to the information.”30
Moreover, the court understood the data subject’s privacy interest to be so important that the subject could successfully object even if the data were in no way prejudicial. Instead, a data subject may legitimately object if information is “inadequate, irrelevant or no longer relevant, or excessive in relation to [the] purposes [of the processing] and in the light of the time that has elapsed.”31 If that is the case, then a search engine operator must remove the links.32
In the wake of the CJEU’s decision, there has been much criticism unified around one belief: the court got it wrong. Many of the attacks on the decision have been explicitly legal: a number of critics argue that the court incorrectly found Google a data controller subject to the Directive and that the court’s balancing test ignored basic legal principles and rights. Other critics have focused more on the opinion’s consequences, arguing that the decision transferred too much power to private entities to censor the Internet without providing sufficient implementation guidance. But critics ignore that the decision was a reasonable interpretation of the Directive’s text and the deeply held privacy values manifested therein. Critics seeking meaningful change should instead use the decision and the ensuing debate to shape the conversation on a new regulatory regime tailored to the nuances of modern privacy protection and reflective of the values these critics seem to believe are currently underrepresented.
The first line of legal criticism attacks the court’s interpretation of “data controllers” as overly broad in encompassing search engine operators. The British House of Lords report reviewing the decision and its effects lamented that the court’s definition of a data controller was now so broad that it could include “any company that aggregates publicly available data”33 and concluded that the court “could and should have interpreted the Directive much more stringently.”34 The report argued that the court’s decision led to absurd results: “If search engines are data controllers, so logically are users of search engines.”35
The second line of legal criticism challenges the court’s balancing test, which prioritizes privacy rights over nearly all other rights. Critics claim that by creating a presumption toward data erasure, the court created a “super-human[ ]right,”36 even though “there is no hierarchical relationship between the conflicting human rights.”37 By focusing so much on the right to privacy, the CJEU “forgot that other rights [were] also applicable,”38 including freedom of information.39
But the court’s interpretation is firmly grounded in the text of the Directive and its underlying values. Google’s own description of how Internet search works — crawling the web, sorting and indexing the results, running algorithms to determine what to show, and displaying the final results40 — neatly mirrors both the legal and intuitive definitions of data processing and of a data controller. Even the highly critical House of Lords report acknowledged that many of its expert witnesses believed the court had correctly classified search engine operators based on the Directive’s language.41
The criticism of the court’s balancing test also ignores that the lopsided rights prioritization stems from the Directive itself. Although the Directive does acknowledge the importance of the free flow of data to the economy,42 that acknowledgement is immediately subordinated by the first article of the Directive, which describes its object as “protect[ing] the fundamental rights and freedoms of natural persons, and in particular their right to privacy.”43 The court’s text-based interpretation was reasonable and reflected the Directive’s underlying values.
Critics also level two significant, primarily consequentialist attacks against the opinion and its real-world effects. First, critics claim the opinion grants too much power to individuals and Google to censor public materials without oversight. By submitting a form, individuals may effectively “impede access to facts about themselves”44 simply because they would prefer that information no longer be “easily available.”45 Second, critics argue these requests should not be considered or decided upon entirely within a private corporation, without public accountability or scrutiny.46 This is particularly concerning for critics who believe the court’s decision provides little guidance to Google and insufficient protections for the public interests in freedom of expression or information.47
While these consequentialist criticisms reflect valid concerns, they too miss the clear tie between the court’s decision and the Directive’s text and values. For example, the decision’s empowerment of an individual to control the use of his personal data was derived from the prioritization of individual privacy rights in the Directive itself. Similarly, the opinion’s apparent grant of power to Google to decide what information appears in the search results48 traces directly to the Directive’s command that “[i]t shall be for the controller to ensure that [the principles relating to data quality are] complied with.”49 And finally, the lack of guidance on implementation is also a result of the Directive’s broad and vague language.50 While the court could have taken a more active role in providing guidance, it exercised reasonable judicial restraint in allowing Google to craft the parameters of a workable test on its own.
Critics’ failure to fully engage with the privacy values underpinning the Directive and the decision has hindered their full participation in the policy debate. The Council of the European Union is currently considering a new Data Protection Regulation51 that not only enshrines but also expands the rights articulated in the CJEU’s opinion.52 By attacking the opinion without fully acknowledging the underlying values the Directive and the Regulation promote, critics are losing the debate to the privacy advocates: the proposed measure enjoys widespread support across much of Europe and has already passed in the European Parliament.53 The real debate is not on what has already been decided, but on what is yet to come; if critics hope to change the substantive outcome, they must shift their focus from secondary legal and policy arguments to the fundamental values at stake.