Internet & Communications Law Recent Case 128 Harv. L. Rev. 735

Google Spain SL v. Agencia Española de Protección de Datos

Court of Justice of the European Union Creates Presumption that Google Must Remove Links to Personal Data upon Request.

Comment on: Case C-131/12 (May 13, 2014)


In 1995, the European Parliament and the Council of the European Union passed Directive 95/46 on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data (“the Directive”).1 Proposed in 1990, when the Internet did not yet exist in its modern form,2 and passed three years before Google was founded,3 the Directive was intended to regulate and supervise data controllers4 and ensure that data-processing systems “protect the fundamental rights and freedoms of natural persons, and in particular their right to privacy.”5 Recently, in Google Spain SL v. Agencia Española de Protección de Datos,6 the Court of Justice of the European Union (CJEU) interpreted the Directive as creating a presumption that Google must delete links to personal information from search results at the request of a data subject7 unless a strong public interest suggests otherwise. Many American and European analysts have attacked the decision as a mistaken legal interpretation of the Directive that gave too much power to private entities to control public information access. These critiques raise valid concerns, yet the suggestion that it is the court’s interpretation that is at fault misses the point; the legal interpretation was a reasonable reflection of the text of the Directive and the values embodied in it. Critics seeking meaningful change should thus use the decision and the conversation it has generated to shape the debate on what values should be represented — and how — in a new regulatory regime.

On March 5, 2010, Mario Costeja González, a Spanish citizen, lodged a complaint with the Spanish data protection agency, AEPD,8 against a Spanish newspaper as well as against Google Spain SL (“Google Spain”) and Google Inc.9 An Internet user typing Costeja González’s name into Google’s search engine would receive links to two newspaper pages announcing a foreclosure auction on Costeja González’s home.10 In his complaint, Costeja González requested first that the newspaper be required to remove his name, and second that Google Spain “remove or conceal” his personal data so that they no longer appeared in the search results.11 Costeja González argued that because the attachment proceedings had been “fully resolved[,] . . . reference to them was now entirely irrelevant,”12 and he had the right to have the data removed.13

The AEPD denied Costeja González’s complaint against the newspaper, but granted it against Google. The newspaper had no obligation to remove the announcements, as they had been lawfully published.14 However, the agency reasoned that search engine operators were data controllers, that they were thus subject to the Directive, and that Google Spain and Google Inc. were therefore required to remove links to data upon request by the data subject.15 Google Spain and Google Inc. both appealed to Spain’s National High Court (the Audiencia Nacional), which referred several sets of questions to the CJEU for a preliminary ruling concerning the proper interpretation of the Directive.16 The first set of questions17 addressed whether Google should be classified as a data controller (a requirement for being subject to the Directive) and whether Google, as a non–European Union company, was subject to the Directive’s territorial reach. If the court answered both questions affirmatively, it was then asked to determine the scope of Google’s legal responsibility as a data controller and whether a citizen had the right to have Google erase his data — in other words, the scope of the “right to be forgotten.”18

The CJEU’s preliminary ruling was consistent with the AEPD’s interpretation of the Directive. In examining whether Google was a data controller subject to the Directive, the court determined that a search engine’s activities constitute data processing19 because a search engine “‘collects’ such data which it subsequently ‘retrieves[,]’ ‘records[,]’ . . . ‘organi[z]es[,]’ . . . ‘stores’ on its servers[,] and [then] . . . ‘discloses,’”20 and because the data clearly include personal data.21 Given that a search engine operator “determines the purposes and means” of the data processing, a search engine operator should also be regarded as a data controller.22 As a data controller, a search engine operator must comply with the Directive.23

The court then determined that Google Inc.’s presence in Spain was sufficient to subject it to the Directive. Though all of Google Inc.’s data processing occurred outside Spain, Google Spain sold advertising space within the country; since advertising is Google Inc.’s main source of revenue, the court held that the two entities were “closely linked.”24 Google Spain was thus effectively an establishment of Google Inc., making Google Inc. subject to the Directive.25

Having resolved the threshold issues, the court turned to the next inquiry: what were search engine operators’ legal obligations under the Directive? The court noted that the Directive required a balancing test26: while personal data processing was permitted when it was necessary to serve the controller’s or third parties’ legitimate interests, it was not permitted “where such interests are overridden by the interests or fundamental rights and freedoms of the data subject — in particular his right to privacy.”27 Given the “seriousness of [the] interference” with a data subject’s rights, an operator’s economic interests were never sufficient to justify interference with privacy rights;28 moreover, privacy rights “override, as a rule . . . the interest of the general public” in having access to private information.29 This presumption could be overcome only “by the preponderant interest of the general public in having . . . access to the information.”30

Moreover, the court understood the data subject’s privacy interest to be so important that the subject could successfully object even if the data were in no way prejudicial. Instead, a data subject may legitimately object if information is “inadequate, irrelevant or no longer relevant, or excessive in relation to [the] purposes [of the processing] and in the light of the time that has elapsed.”31 If that is the case, then a search engine operator must remove the links.32

In the wake of the CJEU’s decision, there has been much criticism unified around one belief: the court got it wrong. Many of the attacks on the decision have been explicitly legal: a number of critics argue that the court incorrectly found Google a data controller subject to the Directive and that the court’s balancing test ignored basic legal principles and rights. Other critics have focused more on the opinion’s consequences, arguing that the decision transferred too much power to private entities to censor the Internet without providing sufficient implementation guidance. But critics ignore that the decision was a reasonable interpretation of the Directive’s text and the deeply held privacy values manifested therein. Critics seeking meaningful change should instead use the decision and the ensuing debate to shape the conversation on a new regulatory regime tailored to the nuances of modern privacy protection and reflective of the values these critics seem to believe are currently underrepresented.

The first line of legal criticism attacks the court’s interpretation of “data controllers” as overly broad in encompassing search engine operators. The British House of Lords report reviewing the decision and its effects lamented that the court’s definition of a data controller was now so broad that it could include “any company that aggregates publicly available data”33 and concluded that the court “could and should have interpreted the Directive much more stringently.”34 The report argued that the court’s decision led to absurd results: “If search engines are data controllers, so logically are users of search engines.”35

The second line of legal criticism challenges the court’s balancing test, which prioritizes privacy rights over nearly all other rights. Critics claim that by creating a presumption toward data erasure, the court created a “super-human[ ]right,”36 even though “there is no hierarchical relationship between the conflicting human rights.”37 By focusing so much on the right to privacy, the CJEU “forgot that other rights [were] also applicable,”38 including freedom of information.39

But the court’s interpretation is firmly grounded in the text of the Directive and its underlying values. Google’s own description of how Internet search works — crawling the web, sorting and indexing the results, running algorithms to determine what to show, and displaying the final results40 — neatly mirrors both the legal and intuitive definitions of a data processor and controller. Even the highly critical House of Lords report acknowledged that many of their expert witnesses believed the court had correctly classified search engine operators based on the Directive’s language.41

The criticism of the court’s balancing test also ignores that the lopsided rights prioritization stems from the Directive itself. Although the Directive does acknowledge the importance of the free flow of data to the economy,42 that acknowledgement is immediately subordinated by the first article of the Directive, which describes its object as “protect[ing] the fundamental rights and freedoms of natural persons, and in particular their right to privacy.”43 The court’s text-based interpretation was reasonable and reflected the Directive’s underlying values.

Critics also level two significant, primarily consequentialist attacks against the opinion and its real-world effects. First, critics claim the opinion grants too much power to individuals and Google to censor public materials without oversight. By submitting a form, individuals may effectively “impede access to facts about themselves”44 simply because they would prefer that information no longer be “easily available.”45 Second, critics argue these requests should not be considered or decided upon entirely inside a private corporation, without public accountability or scrutiny.46 This is particularly concerning for critics who believe the court’s decision provides little guidance to Google and insufficient protections for the public interests in freedom of expression or information.47

While these consequentialist criticisms reflect valid concerns, they too miss the clear tie between the court’s decision and the Directive’s text and values. For example, the decision’s empowerment of an individual to control the use of his personal data was derived from the prioritization of individual privacy rights in the Directive itself. Similarly, the opinion’s apparent grant of power to Google to decide what information appears in the search results48 traces directly to the Directive’s command that “[i]t shall be for the controller to ensure that [the principles relating to data quality are] complied with.”49 And finally, the lack of guidance on implementation is also a result of the Directive’s broad and vague language.50 While the court could have taken a more active role in providing guidance, it exercised reasonable judicial restraint in allowing Google to craft the parameters of a workable test on its own.

Critics’ failure to fully engage with the privacy values underpinning the Directive and the decision has hindered their full participation in the policy debate. The Council of the European Union is currently considering a new Data Protection Regulation51 that not only enshrines but also expands the rights articulated in the CJEU’s opinion.52 By attacking the opinion without fully acknowledging the underlying values the Directive and the Regulation promote, critics are losing the debate to the privacy advocates: the proposed measure enjoys widespread support across much of Europe and has already passed in the European Parliament.53 The real debate is not on what has already been decided, but on what is yet to come; if critics hope to change the substantive outcome, they must shift their focus from secondary legal and policy arguments to the fundamental values at stake.

Footnotes
  1. Council Directive 95/46, 1995 O.J. (L 281) 31 (EC).

  2. Opinion of Advocate General Jääskinen ¶ 10, Google Spain SL v. Agencia Española de Protección de Datos (June 25, 2013) (Case C-131/12), http://curia.europa.eu/juris/document/document.jsf?text=&docid=138782&doclang=EN [http://perma.cc/Y7C5-65WB].

  3. Our History in Depth, Google, http://www.google.com/about/company/history (last visited Oct. 26, 2014) [http://perma.cc/SK8Q-3K22].

  4. The Directive defines a data controller as any entity “which alone or jointly with others determines the purposes and means of the processing of personal data.” Council Directive 95/46, supra note 1, art. 2(d), at 38.

  5. Id. art. 1(1), at 38.

  6. Case C-131/12, Google Spain SL v. Agencia Española de Protección de Datos (May 13, 2014), http://curia.europa.eu/juris/document/document.jsf?text=&docid=152065&doclang=EN [http://perma.cc/ED5L-DZRK].

  7. A data subject is the person to whom the data relate. See Council Directive 95/46, supra note 1, art. 2(a), at 38.

  8. In 1999, Spain incorporated the provisions and protections of Directive 95/46 into national legislation and established a national agency to implement the Directive and handle complaints. See Ley de Protección de Datos de Carácter Personal (B.O.E. 1999, 298); see also Council Directive 95/46, supra note 1, art. 28, at 47–48; Privacy Int’l, Spain ch. I (2011), https://www.privacyinternational.org:4443/reports/spain/i-legal-framework [http://perma.cc/6DSD-9CJG].

  9. Google Spain SL, Case C-131/12, ¶ 14. Google Spain is a Spanish subsidiary of Google that acts as Google’s “commercial representative . . . for its advertising functions.” Opinion of Advocate General Jääskinen, supra note 2, ¶ 62. Google Spain does not process data as part of Google’s search engine function, Google Spain SL, Case C-131/12, ¶ 46, so when it received Costeja González’s original takedown request, the company forwarded it to Google Inc. as the provider of the search engine service, Opinion of Advocate General Jääskinen, supra note 2, ¶ 20.

  10. Google Spain SL, Case C-131/12, ¶ 14.

  11. Id. ¶ 15.

  12. Id. The Directive provides that “personal data must be . . . adequate, relevant and not excessive in relation to the purposes for which they are collected.” Council Directive 95/46, supra note 1, art. 6, at 40.

  13. See Council Directive 95/46, supra note 1, art. 12, at 42 (“Member States shall guarantee every data subject the right to obtain from the [data] controller . . . the rectification, erasure or blocking of data the processing of which does not comply with the provisions of this Directive . . . .”).

  14. Google Spain SL, Case C-131/12, ¶ 16. Indeed, the paper had been ordered by the government to publish the announcements to publicize the auction. Id.

  15. Id. ¶ 17.

  16. Id. ¶¶ 18–20. National courts may refer questions on the interpretation of European Union law to the CJEU for guidance. See Ralph H. Folsom, Principles of European Union Law 87–88 (4th ed. 2005). The CJEU’s decisions, called preliminary rulings, are binding in the case referred and at least persuasive in other nations. Id. at 88–89.

  17. The Spanish court referred these questions in three different groups; the order reflected here is the order in which the CJEU answered.

  18. Google Spain SL, Case C-131/12, ¶ 20. The phrase “right to be forgotten” has many slightly different uses but generally describes the idea that a person should have the right to escape irrelevant aspects of his past. See generally European Commission, Factsheet on the “Right to be Forgotten” Ruling 2 (2014), http://ec.europa.eu/justice/data-protection/files/factsheets/factsheet_data_protection_en.pdf [http://perma.cc/J5LC-K3DC].

  19. The Directive defines data processing as “any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction.” Council Directive 95/46, supra note 1, art. 2(b), at 38 (emphasis added).

  20. Google Spain SL, Case C-131/12, ¶ 28.

  21. Id. ¶ 27. Personal data is defined in the Directive as “any information relating to an identified or identifiable natural person (‘data subject’).” Council Directive 95/46, supra note 1, art. 2(a), at 38.

  22. Google Spain SL, Case C-131/12, ¶ 33. The court stated that given the significant role search engines play in modern life, finding otherwise would be contrary not only to the “clear wording” of the Directive, but also to the Directive’s objective of ensuring “effective and complete protection of data subjects.” Id. ¶ 34. The court noted “the important role played by the internet and search engines in modern society,” which magnified the interference with an individual’s rights to privacy and personal data protection by providing a “structured overview” of “vast” amounts of personal information that otherwise “could not have been interconnected.” Id. ¶ 80.

  23. Id. ¶ 38.

  24. Id. ¶ 46.

  25. Id. ¶ 60.

  26. See Council Directive 95/46, supra note 1, art. 7, at 40 (establishing the criteria for legitimate data processing).

  27. Google Spain SL, Case C-131/12, ¶ 74. For the source of these fundamental rights, see Charter of Fundamental Rights of the European Union, art. 8, 2000 O.J. (C 364) 1, 10.

  28. Google Spain SL, Case C-131/12, ¶ 81.

  29. Id. ¶ 99 (emphasis added).

  30. Id. (emphasis added). This might be the case, for example, if the data subject played a significant role in public life. See id. ¶ 81. The court also noted that search engine operators may be required to remove links to data even when the original publisher does not have the same obligation. Id. ¶ 88. Because a search engine “heighten[s]” the level of interference with the right to privacy, id. ¶ 80, the balancing test may come out differently for search engines than for original publishers, id. ¶ 86. Moreover, some original publishers will not be subject to the Directive’s jurisdiction, thus compromising privacy protection if data subjects were required to simultaneously ensure the erasure of material from the primary source. Id. ¶ 84.

  31. Id. ¶ 93. Directive Articles 14(a) and 12(b) provide the authority for objection and removal, while Article 6(1)(c) establishes the relevant substantive conditions. See Council Directive 95/46, supra note 1.

  32. Google Spain SL, Case C-131/12, ¶ 94.

  33. European Union Committee, EU Data Protection Law: A ‘Right to Be Forgotten’?, 2014-5, H.L. 40, ¶ 40 (U.K.) [hereinafter H.L. Committee Report] (quoting statement from Morrison & Foerster).

  34. Id. ¶ 55 (quoting Professor Luciano Floridi, who also stated to the committee that the court could have “conclud[ed] that a link to some legally available information does not process the information in question”); see also Danny O’Brien & Jillian York, Rights that Are Being Forgotten: Google, the ECJ, and Free Expression, Electronic Frontier Found. (July 8, 2014), https://www.eff.org/deeplinks/2014/07/rights-are-being-forgotten-google-ecj-and-free-expression [http://perma.cc/BR4Z-C2SH].

  35. H.L. Committee Report, supra note 33, ¶ 41. As people determining “the purposes and means of the processing of personal data,” Council Directive 95/46, supra note 1, art. 2(d), at 38, individuals using search engines could conceivably be characterized as data controllers, see Opinion of Advocate General Jääskinen, supra note 2, ¶ 81 & n.57.

  36. Martin Husovec, Should We Centralize the Right to Be Forgotten Clearing House?, Center for Internet & Soc’y (May 30, 2014, 1:28 PM) (quoting Hans Peter Lehofer, EuGH: Google muss doch vergessen - das Supergrundrecht auf Datenschutz und die Bowdlerisierung des Internets, e-comm (May 13, 2014), http://blog.lehofer.at/2014/05/eugh-google-muss-doch-vergessen-das.html [http://perma.cc/4YA5-X86V]), http://cyberlaw.stanford.edu/blog/2014/05/should-we-centralize-right-be-forgotten-clearing-house [http://perma.cc/ZH87-YC5F].

  37. Id. (emphasis omitted).

  38. Steve Peers, The CJEU’s Google Spain Judgment: Failing to Balance Privacy and Freedom of Expression, EU Law Analysis (May 13, 2014), http://eulawanalysis.blogspot.co.uk/2014/05/the-cjeus-google-spain-judgment-failing.html [http://perma.cc/T8QN-W2G2]; cf. Caro Rolando, How “The Right to Be Forgotten” Affects Privacy and Free Expression, IFEX (July 21, 2014), https://www.ifex.org/europe_central_asia/2014/07/21/right_forgotten [http://perma.cc/A54Z-2TG5] (outlining the various rights at stake).

  39. See EU Court Enshrines “Right to Be Forgotten” in Spanish Case Against Google, Reporters Without Borders (May 14, 2014), http://en.rsf.org/union-europeenne-eu-court-enshrines-right-to-be-14-05-2014,46278.html [http://perma.cc/6DE2-APLM].

  40. See How Search Works, Google, http://www.google.com/insidesearch/howsearchworks (last visited Oct. 26, 2014) [http://perma.cc/U4KK-B8Q8].

  41. H.L. Committee Report, supra note 33, ¶¶ 27–29. This may not have been the only permissible reading, but it did follow naturally from the text. The Advocate General proposed an alternative, narrower interpretation of the data controller provision suggesting that the Directive contemplated that the data controller have responsibility for the personal data it was processing, implying an awareness that what was being processed was personal data. Opinion of Advocate General Jääskinen, supra note 2, ¶¶ 82–83. Because Google’s searching functions cannot distinguish personal data, the Advocate General argued it was not proper to classify Google as a data controller. See id. ¶¶ 84, 89. However, this understanding was not based on the text of the Directive, previous case law, or even the intention of the parties at the time the Directive was written or passed (as search engines as such did not yet exist); instead, it was based on the views of a purely advisory working party formed after the Directive was signed. See id. ¶¶ 82–83, 88; Article 29 Working Party, European Commission, http://ec.europa.eu/justice/data-protection/article-29/index_en.htm (last visited Oct. 26, 2014) [http://perma.cc/ZH4E-VYPH] (noting that the Article 29 Working Party “has an advisory status and acts independently”).

  42. See, e.g., Council Directive 95/46, supra note 1, pmbl. ¶ 2, at 31 (“[D]ata-processing systems . . . must . . . contribute to economic and social progress, trade expansion and the well-being of individuals . . . .”); id. pmbl. ¶ 56, at 36 (“[C]ross-border flows of personal data are necessary to the expansion of international trade . . . .”).

  43. Id. art. 1(1), at 38.

  44. Jonathan Zittrain, Don’t Force Google to ‘Forget,’ N.Y. Times, May 14, 2014, http://www.nytimes.com/2014/05/15/opinion/dont-force-google-to-forget.html [http://perma.cc/M8B7-QM83].

  45. H.L. Committee Report, supra note 33, ¶ 8.

  46. O’Brien & York, supra note 34 (“Restrictions on free expression need to be considered, in public, by the courts, on a case-by-case basis, and with both publishers and plaintiffs represented, not via an online form . . . .”).

  47. See, e.g., Rolando, supra note 38. Critics point to some of Google’s early and controversial link removal decisions as signs of some of these negative consequences. For example, Google removed a link to an article published by The Guardian about a now-retired soccer referee who had been accused of lying about why he had awarded a penalty kick. See Mark Scott, Google Reinstates European Links to Articles from The Guardian, N.Y. Times, July 4, 2014, http://www.nytimes.com/2014/07/05/business/international/google-to-guardian-forget-about-those-links-right-to-be-forgotten-bbc.html [http://perma.cc/MJG3-GNFZ]. The paper complained, and Google eventually reinstated the links. See id. Google did not share who had requested the article be removed, why that individual made the request, or Google’s own rationale for removing or reinstating the links. See id. To critics, this episode highlighted the difficulty of applying the court’s vague balancing test and the dangers of letting these decisions happen in private, without having all parties represented or any public oversight and accountability. While this story ended happily (at least from the perspective of advocates for unrestricted information), critics doubt that a smaller news organization or website would have the clout, wherewithal, or resources to challenge Google’s decisions. See O’Brien & York, supra note 34.

  48. It is worth noting that while critics lament Google’s supposedly new power to determine what users do and do not see in their search results, this concept is in fact both the entire premise and purpose of Google. Google — like many other search engines — ranks and displays content based on over 200 factors other than pure relevance, including country of origin, previous browsing history, and freshness of content. See Algorithms, Google, http://www.google.com/insidesearch/howsearchworks/algorithms.html (last visited Oct. 26, 2014) [http://perma.cc/45NR-ZB5B]. Google has never been an unbiased party, and the company’s decisions affect everything users see in their search results. The critical literature is notably devoid of any analysis of why this additional decision point is so much worse.

  49. Council Directive 95/46, supra note 1, art. 6, at 40; see also id. art. 12, at 42 (“Member States shall guarantee every data subject the right to obtain from the controller: . . . the rectification, erasure or blocking of data . . . .” (emphasis added)).

  50. For example, the right to be forgotten derives from Article 12(b), which allows data subjects to request data deletion when “appropriate” and suggests that its list of grounds for deletion is nonexhaustive. Id. art. 12(b), at 42. Even less helpfully, Article 14(a) allows objections on “compelling legitimate grounds” and all but mandates case-by-case review by tying the validity of the objection to the subject’s “particular situation.” Id. art. 14(a), at 42.

  51. See Proposal for a Regulation of the European Parliament and of the Council on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data (General Data Protection Regulation), COM (2012) 11 final (Jan. 25, 2012), http://ec.europa.eu/justice/data-protection/document/review2012/com_2012_11_en.pdf [http://perma.cc/EJP5-34QV].

  52. See H.L. Committee Report, supra note 33, ¶ 30 (noting, with dismay, that the Regulation would provide a “right to erasure” not just against data controllers, but against all third parties); European Commission, supra note 18, at 2–4 (promoting the fact that the Regulation (1) will ensure that “non-European companies, when offering services to European consumers, must apply European rules,” id. at 2, (2) will shift the burden of proof to companies to prove that data must be retained, id. at 3, and (3) will impose fines on companies that do not respect the rules, id. at 4).

  53. Originally proposed in 2012 and approved by the European Parliament in 2014, the Regulation may receive final approval as early as late 2014 or early 2015. EU Legislative Process Updates, Wilson Sonsini Goodrich & Rosati, LLP, http://www.wsgr.com/eudataregulation/process-updates.htm (last visited Oct. 26, 2014) [http://perma.cc/SXW7-QY7N].