By Tisunge (Sunga) Mkwezalamba*
Part II: Interpreting the Right to be Forgotten
This section provides a case summary of Google Spain v. González. In its holding, the Court of Justice of the European Union determined that search engines are responsible for the removal of personal data upon request, regardless of how the data was obtained, so long as the processing of that data does not comply with the principles of Directive 95/46/EC.
(a). Google Spain v. González
In 2010, Mario Costeja González, a Spanish citizen, filed a complaint against a local Spanish newspaper with the Spanish Data Protection Authority (AEPD). González sought to have the newspaper remove a disfavoring news article published in 1998 regarding an auction of his home in connection with the recovery of his debts. González argued that the matter concerning his debts had been fully resolved for a number of years, and therefore, the contents of the article, which were his personal data under the definition of Article 2(a) of Directive 95/46/EC, were now irrelevant and should be erased and blocked pursuant to Article 12(b) of the same directive. González named Google Spain and Google as co-defendants in his complaint. González argued that Google should remove all links to the article pursuant to the same provisions of the directive.
The AEPD rejected the complaint as it related to the newspaper, reasoning that the newspaper was legally justified in publishing information about the auction because the auction took place under the order of a government authority and was intended to achieve maximum publicity in order to secure as many bidders as possible. Astonishingly, the AEPD did not extend the same protection it afforded the newspaper to Google Spain and Google, and instead upheld the complaint against them.
Google Spain and Google appealed the decision before Spain’s National High Court (Audiencia Nacional). Google argued that Directive 95/46/EC did not apply to Google Spain because Google Spain’s activities in the EU were advertising, which Google contended was not processing of personal data pursuant to the directive. Further, Google argued that even if the court found Google Spain to have engaged in the processing of personal data, the company was established outside of the directive’s jurisdiction.
Spain’s National High Court decided to stay the proceedings and referred questions to the Court of Justice of the European Union (CJEU) for a preliminary ruling regarding Directive 95/46/EC. The questions submitted to the CJEU were: (1) whether Google Spain was established in Spain within the meaning of Article 4(1)(a), and therefore subject to the directive; (2) whether Google’s activities in Spain fell under the Article 2(b) definition of processing of personal data; and (3) if the aforementioned questions were in the affirmative, whether Article 12(b) obligated Google to erase or block the processing of personal data lawfully published by a third party.
On May 13, 2014, the CJEU reached a decision against Google. The decision opened the floodgates for similar claims against large search engines.
Regarding whether Google Spain was established in Spain within the meaning of Article 4(1)(a), the court found that Google was subject to its jurisdiction because it was established in Spain through the activities of its subsidiary in a member state. The court reasoned that application of the directive did not require that Google Spain, the entity established in the EU, carry out processing. The court held that processing by the established entity was not a requirement so long as the established entity’s activities were closely linked to the processing activities.
Next, concerning whether Google’s activities fell under the Article 2(b) definition of processing of personal data, the court concluded that a search engine’s activity of finding information published or placed on the internet by third parties, indexing it automatically, storing it temporarily, and making it available to internet users according to a particular order of preference must be classified as “processing of personal data” within the meaning of Article 2(b), and the operator of the search engine must be regarded as the “controller” with respect to that processing, within the meaning of Article 2(d).
As a result, the court held that a search engine is obligated under Article 12(b) to remove personal data in order to comply with the rights in the directive, regardless of whether the data was obtained lawfully. In its holding, the court expanded the scope of personal data subject to erasure or blocking beyond data that is inaccurate or incomplete. Because it decided that the right to be forgotten exists wherever the processing of personal data does not comply with the rights in the directive, the court set out circumstances that it believed accorded with the directive’s principles. The court held that the right to be forgotten should be afforded when the personal data is “irrelevant or no longer relevant, or is excessive in relation to the purposes of the processing at issue carried out by the search engine, even if the information is not erased beforehand or simultaneously from those web pages, and even when its publication in itself on those pages is lawful.” Notably, the court found that disfavoring personal data, such as the announcement of González’s auction, fell within these circumstances because it was no longer relevant.
(b). Aftermath of González
Though the right to be forgotten did not previously exist explicitly in EU law, the CJEU read it into Articles 12(b) and 14(a) of Directive 95/46/EC. In its holding, the court expanded the definition of processing to include closely related activities such as advertising, and held that a search engine’s activity of providing links to websites containing personal data constitutes processing of the personal data available through those links. Further, the court provided a list of circumstances in which the right to be forgotten could be exercised, including disfavoring information that is no longer relevant.
Most troublesome in the court’s interpretation of when the right may be exercised was its determination that search engines must remove links to personal information that was lawfully obtained and whose publication remains protected for the original publisher. Thus, an individual who wants his or her personal data removed from the internet can simply apply to search engines, which must then remove the link. Considering the number of individuals who use search engines to locate personal data, it is only natural that individuals will be inclined to appeal to search engines directly.
As it relates to the GDPR, this is problematic because the regulation’s language on the right to be forgotten tracks the broad interpretation adopted in González. Thus, search engines remain the misguided targets of the right to be forgotten.
Part III: Why the Right to be Forgotten is Bad Policy and Bad Law
In this section, I argue that the EU’s creation of a right to be forgotten is bad policy and bad law. First, the EU’s interpretation of the right is bad policy and bad law because its application does not actually remove access to the personal data requested for removal. Second, it is likely to lead to instability in capital markets, where investments and extensions of credit rely on personal data reflecting an individual’s market participation. Third, as currently interpreted, the right compels large search engines to enforce an international right at their own expense. This is significant because search engines must then act as a judiciary applying EU law on the strength of a single precedent standing for a broad interpretation of a newly created privacy right. Lastly, the CJEU’s interpretation of the law disregards guaranteed freedoms, mainly the freedom of expression, in favor of the right to be forgotten.
1. The EU’s interpretation of the right to be forgotten does not remove access to the personal data that has been requested for removal.
The EU’s creation of a right to be forgotten is bad policy and bad law because it does little to actually restrict access to personal data. In a globalized society, search engines with the capabilities of Google provide regional services; however, access to a search engine’s services across regions remains possible. For instance, following the decision in González, no links to the newspaper article about the auction of González’s home should have appeared in a search of González’s name on Google’s Spanish search domain (google.es). Despite the article’s removal from Google’s Spanish domain, however, it remained available through Google’s US search domain (google.com). Essentially, once a search engine is required to remove a link, only the link is removed; the content remains. Further, other search engines will still provide links to the personal data that has been requested for removal. Thus, following González, a search through Yahoo! would still have provided access to the article detailing the auction. Consequently, because the CJEU afforded protection to the publisher but not to search engines, the right to be forgotten is left for search engines to deal with.
There are also other concerns with the impracticality of the rule. These days, embarrassing information is shared across many media. At any moment, embarrassing content can be disseminated throughout the internet, reaching far and wide in a short amount of time. When content spreads rapidly in this way, it is said to “go viral” or be “trending.” Locating personal data in order to de-link it becomes a long, painstaking process, particularly because the data can become undetectable or irretrievable due to the rate at which it is shared. For instance, a trending photo that identifies an individual can be traced easily when those sharing it include identifiers such as the name of the person pictured. Once it has been shared numerous times, however, that name may be misspelled or omitted, making it more difficult for search engines to locate the photo. This is especially true in an age when embedding content, rather than linking to or sharing it, is gaining ground in digital information sharing. Further, it is unclear from the ruling in González and the text of the GDPR which media controllers are supposed to remove content from, and to what extent. Assume a video from a news channel discussing González’s auction were spliced into a video uploaded to YouTube.com. Would Google, as the owner of YouTube, be required to take down the entire video? These questions remain unanswered and are likely to create a slippery slope for search engines.
2. Removal of lawfully published personal data linking individuals to market decisions affects markets and democratic decision-making.
We live in a world where markets make decisions based on personal data. Thus, the availability of personal data, whether or not it is disfavoring, has significant value in the market. For example, decisions in the labor market are increasingly influenced by personal data published willingly on social media sites. CareerBuilder, a US-based employment website, reported that more than 40 percent of employers research job candidates’ personal data on social media sites. Employers use the personal information available on social media accounts to evaluate candidates. Some may argue that employers look only for information that reveals a candidate’s moral character. However, CareerBuilder’s report found that employers are more concerned with information that supports a candidate’s qualifications for the job, to ensure they recruit the best available candidates. In a competitive job market, this is a good thing. If individuals are able to remove personal data when it is disfavoring or does not attest to their qualifications, such as personal data indicating that an individual did not work in the capacity his or her résumé claims, then employers risk making costly employment decisions.
The most important reason markets need access to personal data relates to capital investment and the extension of credit. Buyers need confidence in the personal data available in the market to make purchasing decisions. Personal information is needed to ascertain the fair market value of an item, such as a vehicle’s history. The history of a vehicle is personal data because it identifies the vehicle’s condition under its prior owners. Thus, buyers should be able to retrieve all necessary information about a purchase before making a decision. If the right to be forgotten is exercised to detach a seller’s identity from a market decision because it is disfavoring, such as the auction of González’s home over bad debts, then buyers will be unable to make well-informed decisions.
Alexa, a company that provides web traffic and data analytics, publishes a list of the most frequently visited websites. Of the top ten, three are search engines. This is significant because it indicates how much individuals trust the information available through search engines. In 2011, more than 75 percent of individuals used search engines to find local business information. Thus, proponents of enforcing or creating the right to be forgotten seem to overlook how reliant individuals are on the information search engines provide. As it relates to making decisions in the market, this is of particular concern because individuals are likely to perform their own market research to save costs. For instance, a homeowner interested in purchasing the services of a lawn care business may wish to research the provider’s previous work before contracting for its services. Researching all potential lawn care providers in the area would be time consuming, so the homeowner is likely to visit websites that offer reviews of lawn care providers. If a lawn care provider who has received unsavory reviews of her work exercises her right to be forgotten, arguing that the reviews are disfavoring and excessive, the information will be removed and the homeowner will be unable to make an informed decision.
Further, by extending the right to be forgotten to personal data such as debt history, creditors could be less willing to extend credit. Creditors extend credit based on personal data regarding an individual’s ability to repay them with interest. A history of bad debts is a type of personal information that creditors rely on. By allowing individuals to remove personal data relating to bad debts that have already cleared, as was the case in González, creditors may suffer if they extend credit to individuals who are unable to repay them.
In addition to the market’s need for personal data, personal data is needed to make decisions in matters where an individual’s moral character is significant. In a democratic society, we rely on a person’s past to judge whether it supports holding a high position in society. For instance, individuals who run for public office must provide access to some of their public records. Likewise, aspiring lawyers must complete a character and fitness report in which they disclose their personal data, whether or not it is disfavoring. If the public no longer has access to, or can no longer rely on, the information made available, we will be unable to make informed decisions and will come to distrust the democratic process.
3. The EU’s interpretation of the right to be forgotten asks search engines to enforce EU privacy law and risk liability for content published by third parties.
A quick overview of the resources Google has added in response to González helps demonstrate the burdensome consequences of the decision. In response to González, Google has had to employ full-time staff to review applications to de-link content. Google has also established an advisory council that includes legal, data protection, and human rights experts; the council assists the staff responsible for giving effect to the right to be forgotten. Google has essentially been forced to serve as judge and jury over still-unclear privacy terms with little case precedent. This is alarming considering that Article 8 of the Charter of Fundamental Rights of the European Union (which affords the right to protection of personal data) states that compliance with the protection of personal data is subject to control by an independent authority. Directive 95/46/EC likewise provides for data protection authorities who are to determine data privacy matters at multiple levels. Nonetheless, neither the language of the GDPR nor the González court’s interpretation of the right to be forgotten contemplates that member states’ privacy authorities should act as the judiciary determining whether applicants should be afforded the right, which unfairly places the burden on search engines like Google.
The EU’s neglect of the text of the law will undoubtedly lead to an insurmountable caseload before Google’s newly appointed judiciary. Since the González ruling, Google has received nearly half a million requests to have personal data removed. Google’s costs rise further if it fails to comply adequately with the brazen regulation: if Google does not de-link all content it can reasonably de-link, it could be fined an amount equivalent to two percent of its annual worldwide turnover for noncompliance under the GDPR.
4. The right to be forgotten disregards other guaranteed freedoms, mainly the freedom of expression.
Article 11 of the Charter of Fundamental Rights of the European Union provides for a right to freedom of expression. Free speech, alongside freedom of assembly, facilitates open discussion, which is seen as a necessity for a functioning democracy such as the US. However, as was the case in González, the right to privacy appears to trump the right to free speech in the EU.
Most search engines are based in the US, where the right to free speech is strong. A fundamental right, free speech is closely guarded in the US, with few exceptions, such as the famous “clear and present danger” rule. This is problematic for search engines acting as a judiciary because they would be required to exercise censorship they do not support in the name of privacy rights granted in the EU. There is a danger that search engines, when faced with a country’s demand to comply with privacy regulations they disagree with, may choose to say “no thanks” rather than continue operations in that country. This occurred in 2010, when Google abandoned its operations in China because it fundamentally disagreed with China’s level of censorship. If search engines one day decide not to do business in countries enforcing the right to be forgotten, citizens will be the ultimate losers.
Id. at ¶¶ 62, 94; Council Directive 95/46, arts. 12(b), 14(a), 1995 O.J. (L 281) 31 (EC) [hereinafter “Directive 95/46/EC”], available at http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:31995L0046&from=en.
Press Release, CareerBuilder, 35 Percent of Employers Less Likely to Interview Applicants They Can’t Find Online, According to Annual CareerBuilder Social Media Recruitment Survey (May 14, 2015), available at http://www.careerbuilder.com/share/aboutus/pressreleasesdetail.aspx?sd=5%2F14%2F2015&id=pr893&ed=12%2F31%2F2015.
John A. Enahoro and Jumoke Jayeoba, Value Measurement and Disclosures in Fair Value Accounting, 3(9) ASIAN ECON. & FIN. REV. 1170 (2013), available at http://www.aessweb.com/pdf-files/3(9)%201170-1179.pdf.
As Media Habits Evolve, Yellow Pages and Search Engines Firmly Established As Go-To Sources for Consumers Shopping Locally, PR NEWSWIRE (June 13, 2011, 9:10 AM), http://www.prnewswire.com/news-releases/as-media-habits-evolve-yellow-pages-and-search-engines-firmly-established-as-go-to-sources-for-consumers-shopping-locally-123740559.html.
European Privacy Requests for Search Removals, GOOGLE, https://www.google.com/transparencyreport/removals/europeprivacy/?hl=en (last updated Mar. 22, 2016). As of March 22, 2016, Google had evaluated 1,420,812 URLs for removal based on 406,329 requests, and had removed 42.6 percent of those URLs.