In May this year, the European Court of Justice delivered its seminal ruling in Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González.

Following a referral to it by the National High Court of Spain, the Court of Justice ruled that under the European Data Protection Directive (Directive 95/46), an internet search engine operator (such as Google or Microsoft’s “Bing”) is responsible for the processing that it carries out of personal data which appear on web pages published by third parties. As such, where the search results for an individual’s name include a link to a web page containing personal data relating to them, that individual – as the “data subject” – can approach the search engine operator directly to request the removal of personal data which are inadequate or excessive for the purposes for which they are processed, which have become irrelevant, or which have not been kept up to date.

Balancing exercise

The Court sought to carry out a balancing exercise between, on the one hand, the data subject’s rights to privacy and data protection, and on the other hand both the economic interests of the “data controller” (i.e. the search engine operator) and the freedom to receive information which is afforded to web users by Article 10 of the European Convention on Human Rights. It concluded that where a data subject establishes that he has a right to block the processing of inadequate, irrelevant, excessive, out-of-date or unnecessary personal data, this overrides, as a general rule, both the economic interests of the operator and the interests of the public in having access to that information. However, the Court observed that this would not be the case where, in view of the role played by the data subject in public life, the interference with his fundamental rights was justified by a greater interest of the general public in having access to the information in question. Accordingly, search engine operators are now required to undertake a balancing exercise of their own, deciding in each case whether public interest considerations outweigh the data subject’s right to have certain personal data removed from their search results.

Where the data subject’s request is not granted by the search engine operator, the individual is able to bring the matter before the national data protection authority (in England and Wales, the Information Commissioner’s Office) or the courts, both of which have the power to order the removal of personal data.

The Court of Justice’s ruling on Directive 95/46, which has come to be known as the “right to be forgotten”, has undoubtedly come as a shock to the system for Google, bringing home the far-reaching nature of its data protection obligations in the European Union, and compelling it to establish a system for dealing with removal requests. The sheer number of such requests to date has been extraordinary. In the months since the ruling (as of 15 December 2014), Google is reported to have received 185,745 requests, and to have evaluated 667,079 URLs (the single greatest number of which relate to Facebook) in order to decide whether their removal from its search results would be appropriate. The highest number of requests has come from France – historically a State that has afforded significant protection to individuals’ rights to privacy – with 37,118 requests having been made in relation to 120,245 URLs. Behind France and Germany, requests from the UK represent the third highest number, with 23,889 requests said to have cited a total of 91,384 URLs.

It should be noted that although, following González, the provisions of Directive 95/46 are applicable to any search engine accessible in the EU, Google limits its removals to its European domains, such as ‘google.co.uk’. It has not removed links from search results at ‘google.com’, arguing that the Directive does not apply to the US (the user base for which ‘google.com’ is said to cater) or to other non-EU jurisdictions. Google has been criticised for this, given that ‘google.com’ and ‘google.co.uk’, for example, are equally accessible from within the EU. As yet, Google has not fully explained its stance on these territorial issues, although some argue that only by removing links from all of its search engine sites can it truly comply with the requirements of the Directive.

Quite apart from the volume of requests, it is the question of how the required balancing act should be conducted that is perhaps causing the greatest difficulty for Google. Although it ruled on the point of principle, the Court of Justice did not provide any detailed analysis or guidance as to how the public’s right to access information online should be weighed by the search engine operator against the rights of the individual. Concerns have therefore been raised as to how Google is making this evaluation, and how it is deciding what the public can access via its search engine.

The public interest?

In many cases, the decision appears to be relatively straightforward. Google cites the example of a victim of crime in Italy, who requested the removal of three links to web pages discussing the crime, which took place decades ago. Another example is given of a man from the UK, who sought the removal of a link to a news summary of his conviction in the local magistrates’ court, where that conviction is now deemed “spent” under the Rehabilitation of Offenders Act 1974 (a factor which appears to be given considerable weight). In both of these instances, the links were removed by Google.

However, in other cases the position is not as straightforward, and requires a keen appreciation of a factual background which may be complicated and difficult to explain. For example, a doctor from the UK is said to have requested that Google remove more than 50 links to press articles about a botched procedure. Google removed links to three pages which contained personal information about the doctor but did not mention the procedure. It determined, however, that the remaining links to press reports on the incident should remain in its search results. It is unclear how Google came to make this distinction, but its decision appears to derive from a close analysis of the personal data on the various web pages. In another case, Google is reported to have granted a request by an individual named on the membership list of a far-right party, where that person was said no longer to hold such views.

Google has stated that its removals team “weigh whether or not there’s a public interest in the information remaining in [its] search results”, and that they “look at each page individually and base decisions on the limited context provided by the requestor and the information on the webpage”. However, the question arises as to what “public interest” Google is considering in this context, and whether it is the correct test to be applying. Following González, Directive 95/46 suggests a narrow public interest test: the public interest in the information is relevant only where the data subject plays a role in public life, and search engines cannot rely upon the “journalistic” exemption from the Directive. However, some data subjects who do not have such a role in public life appear to have had their requests refused by Google, which suggests that Google may in fact be applying a wider test of “public interest”. There appears to be an inherent tension between the EU’s Data Protection Principles, which the UK courts must apply, the Data Protection Act itself, with its slightly wider public interest test, and the still wider principles being applied by Google.

Google is clearly still adapting to this new regime and has yet to settle fully its policies and procedures, in particular as regards the test for “public interest”. Although Google notes that it “must consider the rights of the individual as well as public interest in the content”, it acknowledges: “This obligation is a new and difficult challenge for us, and we’re seeking advice on the principles Google ought to apply when making decisions on individual cases”.

Clarification needed

There are undeniably issues that require urgent clarification, given that the Court of Justice’s decision has produced such a wide obligation on Google to determine what should remain in its search results, and what should not.

On 26 November 2014, the Article 29 Data Protection Working Party – an independent European advisory body set up under the Directive – published its Guidelines on the implementation of González. The Guidelines provide useful comment on the territorial effect of a de-listing decision, appearing to confront Google’s position in this respect and suggesting that the distinction Google draws between its domains is flawed:

 “…limiting de-listing to EU domains on the grounds that users tend to access search engines via their national domains cannot be considered a sufficient means to satisfactorily guarantee the rights of data subjects according to the ruling. In practice, this means that in any case de-listing should also be effective on all relevant domains, including .com”.

The Working Party also addressed what has been a major issue for some individuals who have had links removed: Google’s practice of contacting website editors to inform them that those links have been blocked (in some cases prompting those editors to publish further information about the data subject, thereby undermining the very premise of the data subject’s request). Not only did the Working Party observe that “there is no legal basis for such routine communication [to the third party website] under EU data protection law”, but it specifically commented that, in general, search engine operators should not inform the editors of pages affected by their removals.

Common criteria for data protection compliance

The Working Party also set out a list of “common criteria”, to be used by Member States’ national data protection authorities (“DPAs”) in deciding whether data protection law has been complied with. These criteria are (with no single criterion being determinative by itself):

  1. Does the search result relate to a natural person – i.e. an individual? And does the search result come up against a search on the data subject’s name?
  2. Does the data subject play a role in public life? Is the data subject a public figure?
  3. Is the data subject a minor?
  4. Is the data accurate?
  5. Is the data relevant and not excessive?
    1. Does the data relate to the working life of the data subject?
    2. Does the search result link to information which allegedly constitutes hate speech/slander/libel or similar offences in the area of expression against the complainant?
    3. Is it clear that the data reflect an individual’s personal opinion or does it appear to be verified fact?
  6. Is the information sensitive within the meaning of Article 8 of the Directive?
  7. Is the data up to date? Is the data being made available for longer than is necessary for the purpose of the processing?
  8. Is the data processing causing prejudice to the data subject? Does the data have a disproportionately negative privacy impact on the data subject?
  9. Does the search result link to information that puts the data subject at risk?
  10. In what context was the information published?
    1. Was the content voluntarily made public by the data subject?
    2. Was the content intended to be made public? Could the data subject have reasonably known that the content would be made public?
  11. Was the original content published in the context of journalistic purposes?
  12. Does the publisher of the data have a legal power – or a legal obligation – to make the personal data publicly available?
  13. Does the data relate to a criminal offence?

The Working Party states that it “strongly encourages the search engines to provide the delisting criteria they use”. At the time of writing, there is no indication as to whether Google will disclose its own criteria in full and bring greater transparency to its review of URLs (at present, Google does not appear to be providing detailed reasons as to why it has decided to remove, or to maintain, any particular link). If it does, it will be interesting to see the extent to which the Working Party’s 13 criteria are taken into account, and the weight given to each. Although the Working Party’s status is merely advisory, and its criteria are described only as a “flexible working tool”, their impact may prove to be significant for Google. If a search engine operator refuses to remove a link, a data subject may appeal to his or her Member State’s DPA. As the DPAs “will apply” those criteria “on a case-by-case basis” in handling such complaints, Google may well find itself on the wrong side of DPA decisions – or indeed back in court – if it strays too far from these criteria in conducting its own reviews.

Carter-Ruck has advised a number of its clients in relation to the EU “right to be forgotten”.
