From the report by the Advisory Council to Google on the Right to be Forgotten -
We were invited, as independent experts, to join the Advisory Council
to Google on the Right to be Forgotten following the Court of Justice
of the European Union’s ruling in Google Spain SL and Google Inc. v. Agencia
Española de Protección de Datos (AEPD) and Mario Costeja González, C-131/12
(“the Ruling”) in May 2014. Google asked us to advise it on performing the
balancing act between an individual’s right to privacy and the public’s
interest in access to information.
This report summarizes our advice
to the company, which is based on several inputs:
- our own independent views and assessments;
- evidence we heard from experts around Europe during our seven-city
tour, some of whom were critical of the Ruling and others of whom
argued that the Ruling came to good conclusions;
- input provided by Internet users and subject matter experts via the
website www.google.com/advisorycouncil/;
- other materials we have reviewed, including European Court
of Human Rights case law, policy guidelines of news organizations,
and the Article 29 Working Party’s Guidelines on the Implementation
of the Ruling adopted on 26 November 2014. ...
We were convened to advise on criteria that Google should use in striking
a balance, such as what role the data subject plays in public life, or whether
the information is outdated or no longer relevant. We also considered the
best process and inputs to Google’s decision making, including input from
the original publishers of information at issue, as potentially important
aspects of the balancing exercise.
We have found the public discussion around the Ruling to be a valuable
contribution to an ongoing general debate about the role of citizen rights
on the Internet. If nothing else, this Ruling and the discussion around it have
raised awareness of how to protect these rights in a digital era. We hope
the recommendations that follow continue to raise that awareness.
2. Overview of the Ruling
The Ruling has been widely referred to as creating a “Right to be Forgotten.”
This reference is so generally understood that this Advisory Council was
convened to advise on the implementation of this right. In fact, the Ruling
does not establish a general Right to be Forgotten.
Implementation of the Ruling does not have the effect of “forgetting”
information about a data subject. Instead, it requires Google to remove
links returned in search results based on an individual’s name when those
results are “inadequate, irrelevant or no longer relevant, or excessive.” Google is not required to remove those results if there is an overriding
public interest in them “for particular reasons, such as the role played
by the data subject in public life.”
Throughout this report, we shall refer to the process of removing links in
search results based on queries for an individual’s name as “delisting”.
Once delisted, the information is still available at the source site, but
its accessibility to the general public is reduced because search queries
against the data subject’s name will not return a link to the source
publication. Those with the resources to do more extensive searches
or research will still be able to find the information, since only the link
to the information has been removed, not the information itself.
The legal criteria for removing content altogether from the underlying
source may be different from those applied to delisting, given the
publisher’s rights to free expression. If Google decides not to delist a link,
the data subject can challenge this decision before the competent Data
Protection Authority or Court.
3. Nature of the Rights at Issue in the Ruling
The Ruling should be interpreted in light of the rights to privacy and
data protection, as well as rights to freedom of expression and access
to information. By referring to these rights, we invoke the conceptual
frameworks established in various instruments that outline and enshrine
fundamental freedoms and rights in Europe.
The right to privacy is enshrined in Article 7 of the Charter of Fundamental
Rights of the European Union (henceforth the Charter) and in Article 8 of
the European Convention on Human Rights (henceforth the Convention).
It affirms respect for private life and freedom from interference by the
public authorities except in accordance with the law.
The right to data protection is granted by Article 8 of the Charter. It ensures
that data are processed fairly, for specified purposes, and on the basis
of consent or some other legitimate basis laid down by law. It also ensures
that data which have been collected can be accessed and rectified. Privacy
and data protection are fundamental rights.
Freedom of expression and information are enshrined in Article 10
of the Convention and Article 11 of the Charter. These rights establish that
expressing ideas and holding opinions as well as receiving and imparting
information and ideas, regardless of frontiers, are fundamental rights.
The Ruling invokes a data subject’s right to object to, and require cessation
of, the processing of data about himself or herself. This right exists
regardless of whether the processing at issue causes harm or is prejudicial
in some way to the data subject.
The Court of Justice of the European Union (CJEU) noted in the Ruling
that the data subject’s fundamental rights “override, as a rule, not only
the economic interest of the operator of the search engine but also the
interest of the general public in finding that information upon a search
relating to the data subject’s name.” However, the Court acknowledged
that, for particular reasons, the public will have an interest in continued
ability to find the link by searching on the data subject’s name. Therefore,
the operator of the search engine is directed to engage in a balancing test
to determine whether the data protection rights of the data subject are
outweighed by “the preponderant interest of the general public in having,
on account of inclusion in the list of results, access to the information
in question.” The question of whether the data subject experiences harm
from such accessibility to the information is in our view relevant to this
balancing test.
Assessing harm to the data subject must be done on an ethical, legal,
and practical basis, which can be understood based both on CJEU case
law interpreting the Charter and on European Court of Human Rights
(ECHR) case law interpreting the Convention. The scope of rights and
harms outlined in Article 8 of the Convention has been well analyzed and
developed in case law outside the data protection context, particularly case
law concerning defamation and privacy claims. The animating values in those
cases often concern personal honor, dignity, and reputation as well
as the protection of sensitive or intimate personal information. Similar
values animate the case law that bounds the scope of data protection
rights under Article 8 of the Charter. As a result, the Ruling should be read
in light of this ongoing dialog between the CJEU and the ECHR, and, where
relevant, case law of national higher courts, delineating the scope of,
and relationship between, privacy and expression rights. The Ruling, while
reinforcing European citizens’ data protection rights, should not
be interpreted as legitimating the censorship of past information
or as limiting the right to access information.
4. Criteria for Assessing Delisting Requests
We identified four primary criteria on which we advise Google to evaluate
delisting requests from individual data subjects. None of these four criteria
is determinative on its own, and there is no strict hierarchy among them.
Furthermore, social or technical changes may cause these criteria to evolve
over time.
4.1. Data Subject’s Role in Public Life
As explicitly noted in the Ruling, the role an individual plays in public life
will weigh on the balancing act Google must perform between the data
subject’s data protection rights and the public’s interest in access to
information via a name-based search. The first step in evaluating a delisting
request should be to determine the individual’s role in public life. These
categorizations are not in themselves determinative, and some evaluation
along the other criteria laid out below is always necessary. However, the
relative weight applied to the other criteria will be influenced by the role
the individual plays in public life.
In general, individuals will fall into one of the following three categories:
- Individuals with clear roles in public life (for example, politicians,
CEOs, celebrities, religious leaders, sports stars, performing artists):
delisting requests from such individuals are less likely to be justified,
since the public will generally have an overriding interest in finding
information about them via a name-based search.
- Individuals with no discernible role in public life: delisting requests
from such individuals are more likely to be justified.
- Individuals with a limited or context-specific role in public life (for
example, school directors, some kinds of public employees, persons
thrust into the public eye because of events beyond their control,
or individuals who may play a public role within a specific community
because of their profession): delisting requests from such individuals
are neither more nor less likely to be justified, as the specific
content of the information being listed is probably going to weigh
more heavily on the delisting decision.
Data subjects related to individuals playing a role in public life present
some interesting edge cases, as they may themselves play a significant role
in public life. In such cases, however, special attention should be paid
to the content of the delisting request, as the data subject’s public role
may be circumscribed. For example, there may be a strong
public interest in information about nepotism in family hiring.
4.2. Nature of the Information
4.2.1. Types of information that bias toward an
individual’s strong privacy interest
1. Information related to an individual’s intimate or sex life.
In general, this information will weigh heavily on the side of privacy rights
in the balancing test against the public interest. The exceptions will generally
be for individuals who play a role in public life, where there is a public
interest in accessing this information about the individual.
2. Personal financial information.
Specific details such as bank account information are likely to be private
and warrant delisting in most cases. More general information about
wealth and income may be in the public interest. For example, in some
countries, the salaries and properties of public employees are treated
as public information; stock holdings in public companies may be of
public interest; or there may be valid journalistic concerns in wealth and
income information, including investigations of corruption.
3. Private contact or identification information.
Information such as private phone numbers, addresses or similar
contact information, government ID numbers, PINs, passwords,
or credit card numbers will weigh heavily on the side of privacy
rights in the balancing test against the public interest.
4. Information deemed sensitive under EU Data Protection law.
Information revealing racial or ethnic origin, political opinions, religious
or philosophical beliefs, trade-union membership, health, or sex life may
all have specific privacy protections in Europe. However, when such data
relates to the role the data subject plays in public life, there can be a strong
public interest in accessing links to this information via a name-based search.
5. Private information about minors.
There is a special privacy consideration for children and adolescents
according to the United Nations Convention on the Rights of the Child.
6. Information that is false, makes an inaccurate association or puts the
data subject at risk of harm.
False information or information that puts the data subject at risk of harm,
such as identity theft or stalking, weighs strongly in favor of delisting.
7. Information that may heighten the data subject’s privacy interests
because it appears in image or video form.
4.2.2. Types of information that bias toward a public interest
1. Information relevant to political discourse, citizen engagement,
or governance.
Political discourse is strongly in the public interest, including opinions and
discussions of other people’s political beliefs, and should rarely be delisted.
2. Information relevant to religious or philosophical discourse.
Religious and philosophical discourse is strongly in the public interest,
including opinions and discussions of other people’s religious and
philosophical beliefs, and should rarely be delisted.
3. Information that relates to public health and consumer protection.
Information related to public health or consumer protection issues weighs
strongly against delisting. For example, reviews of professional services
offered to the public at large may impact consumer safety; this value
is widely recognized in the context of journalistic exceptions. Today,
sources such as individual users on social media sites often provide this
type of information, more so than traditional journalistic sources.
4. Information related to criminal activity.
Data relating to offences or criminal convictions warrants special
treatment under EU Data Protection Law. Where specific laws relating
to the processing of such data provide clear guidance, these should
prevail. Where none applies, the outcome will differ depending on
context. The separate considerations of severity of the crime, the role
played by the requestor in the criminal activity, the recency and the source
of the information (both discussed below), as well as the degree of public
interest in the information at issue will be particularly relevant in assessing
these cases. The evaluation of the public interest in the requested
delistings may differ depending on whether they concern a criminal
offender or a victim of a criminal offense. Information regarding human
rights violations and crimes against humanity should weigh against delisting.
5. Information that contributes to a debate on a matter of general interest.
The public will have an interest in accessing individual opinions and
discussion of information that contributes to a public debate on a matter
of general interest (for example, industrial disputes or fraudulent practice).
The determination of a contribution to public debate may be informed
by the source criterion, discussed below, but once information about
a particular subject or event is deemed to contribute to a public debate
there will be a bias against delisting any information about that subject,
regardless of source.
6. Information that is factual and true.
Factual and truthful information that puts no one at risk of harm will weigh
against delisting.
7. Information integral to the historical record.
Where content relates to a historical figure or historical events, the public
has a particularly strong interest in accessing it online easily via
a name-based search, and it will weigh against delisting. The strongest instances
include links to information regarding crimes against humanity.
8. Information integral to scientific inquiry or artistic expression.
In some cases, removing links from name-based search results will distort
scientific inquiry; in those cases the information may carry public interest
valence. The artistic significance of content constitutes public interest
and will weigh against delisting. For example, if a data subject is
portrayed in an artistic parody, it will weigh in favor of a public interest
in the information.
4.3. Source
In assessing whether the public has a legitimate interest in links to
information via a name-based search, it is relevant to consider the source
of that information and the motivation for publishing it. For example, if the
source is a journalistic entity operating under journalistic norms and best
practices there will be a greater public interest in accessing the information
published by that source via name-based searches. Government
publications weigh in favor of a public interest in accessing the information
via a name-based search.
Information published by recognized bloggers or individual authors of good
reputation with substantial credibility and/or readership will weigh in favor
of public interest. Information that is published by or with the consent
of the data subject himself or herself will weigh against delisting. This
is especially true in cases where the data subject can remove the
information with relative ease directly from the original source webpage,
for example by deleting his or her own post on a social network.
4.4. Time
The Ruling refers to the notion that information may at one point be
relevant but, as circumstances change, the relevance of that information
may fade.
This criterion carries heavier weight if the data subject’s role
in public life is limited or has changed, but time may be a relevant criterion
even when a data subject’s role in public life has not changed. There are
types of information for which the time criterion may not be relevant to a
delisting decision—for example information relating to issues of profound
public importance, such as crimes against humanity.
This criterion will be particularly relevant for criminal issues. The severity
of a crime and the time passed may together favor delisting, such as
in the case of a minor crime committed many years in the past. Conversely,
these factors could suggest an ongoing public interest in the information—for example
if a data subject has committed fraud and may potentially be in new
positions of trust, or if a data subject has committed a crime of sexual
violence and could possibly seek a job as a teacher or a profession
of public trust that involves entering private homes.
Time may also weigh on determining the data subject’s role in public life.
For example, a politician may leave public office and seek out a private life,
or a CEO may step down from his or her role, but information about his
or her time in that role may remain in the public interest as time goes on.
This criterion may also weigh toward approving delisting requests for
information about the data subject’s childhood.