'The New American Privacy' by Richard Peltz-Steele in (2013) 44(2)
Georgetown Journal of International Law argues that
Conventional wisdom paints U.S. and European approaches to privacy at irreconcilable odds. But that portrayal overlooks a more nuanced reality of privacy in American law. The free speech imperative of U.S. constitutional law since the civil rights movement shows signs of tarnish. And in areas of law that have escaped constitutionalization, such as fair-use copyright and the freedom of information, developing personality norms resemble European-style balancing. Recent academic and political initiatives on privacy in the United States emphasize subject control and contextual analysis, reflecting popular thinking not so different after all from that which animates Europe’s 1995 directive and 2012 proposed regulation. For all the handwringing in the United States over encroachment by anti-libertarian EU regulation, a new American privacy is already on the rise.
Peltz-Steele comments that -
Thinking about privacy is in vogue now in academic circles around
the world. Unexceptionally, U.S. scholars and advocates have been
eager to systematize diffuse musings and reconstruct privacy as rational
and sturdy scaffolding for law and regulation. Exceptionally, U.S.
policymakers must fit this reconstructed privacy into an existing superstructure
of civil and economic liberties. That superstructure has been
molded and in places made rigid by the same social developments that
shaped U.S. constitutional law in the twentieth century. The problem is
more one of legal architecture than of public will, and U.S. researchers such as Helen Nissenbaum and Daniel Solove are laying the groundwork
to tackle the project.
Professor Solove posited a sixteen-category taxonomy of information
activities that can harm data subjects. He theorized that if privacy
harms can be clearly articulated, then lawmakers can work back to
define and disincentivize the information practices that result in those
harms. Among the potentially injurious activities, and key areas of
policy discussion in the information age, are the oft-paired practices of surveillance
and secondary use. Both were at issue, for example, in the
recent uproar over Google’s privacy policy revision, by which Google
dropped information-sharing barriers across its various platforms, such
as search engine, electronic mail, and location mapping. This “surveillance”
of user activity allows Google to construct profiles of its users
with a level of intimate familiarity that makes some uncomfortable.
Searches for information about sexual fetishes or venereal diseases are
not the kind of data a user might wish to have associated with her or his
personal identity and home and electronic addresses.
Amplifying qualms over surveillance is the fear of secondary use (and
tertiary use, etc.), that is, the use of information for purposes unrelated
to its initial harvesting. A user might not object to Google’s use of
location mapping to enhance search results for a “florist.” But the
user might be surprised and uncomfortable when an advertising bot a
week later proposes a dating service upon the perceptive gamble that
the twenty-year-old who sought a florist in August would soon be in the
market for a new romantic partner. The situation is not much improved
by knowing that the aforementioned intimate details are part of the same data profile. Google itself is not in the data brokering business
at present, but surveillance and secondary use may result in painful and
invasive privacy violations with real social and financial consequences
when intimate personal profiles are sold wholesale for unrestricted
downstream applications—say, to a potential employer or insurer.
Professor Nissenbaum posited a more elaborate theory of “contextual
integrity” that examines the context in which privacy is implicated
relative to the norms that animate the information use. Her complex
and thoughtful taxonomy defies easy summary. To oversimplify nevertheless,
she outlined four constructs that define context: the role of the
actor in context, such as journalist; the activity in context, such as news
reporting; the social norms that govern in the context, such as the use
of quotation marks to indicate a subject’s own words; and the values
that operate in the context, such as objectivity. Nissenbaum further
outlined four parameters of informational norms: context, such as a
newspaper’s front page; actors, that is, the identities of the information
sender, the receiver, and the data subject; attributes of the information,
such as the physical appearance of a data subject; and most
importantly, transmission principles, including customary and articulated
constraints on information transmission, such as a reporter’s
promise of non-attribution.
The analytical trigger in the Nissenbaum approach is a change in the
context of information use, as determined by a change in the constructs
that define context. A change—say the journalist decides to
use a deep-background interview with a corporate whistleblower to put
words in the mouth of a fictional character in a screenplay—requires
that the new use be tested for consistency with the original parameters
of informational norms. The deep-background agreement, a transmission
principle in the initial disclosure of information, contemplated
no use of the data subject’s words, regardless of the speaker. For that
and various other reasons, contextual integrity is compromised. Lawmakers
may choose to define an invasion of privacy according to such a
compromise of contextual integrity.
Solove’s and Nissenbaum’s creative approaches point to similar results because both are merely tools to articulate existing value
systems. A public library’s database of patron checkouts furthers free
intellectual inquiry and efficient management of a shared resource.
Thus transfer of personal information for national security investigations
(surveillance), or sale of data for commercial profiling (secondary
use), violates privacy rights, whether framed as an aversion of
injurious consequence or as a compromise of contextual integrity.
Within any one cultural tradition, be it American, French, or another,
the proper employment of each approach aids in the detection of a
violation of social norms. The violation then may or may not be used to
demarcate a violation of law or civil rights.
Crucially, Solove and Nissenbaum both reject what Solove termed
“the secrecy paradigm” in favor of a contextual approach. This
divergence from convention exemplifies the resemblance of these
approaches to those of the EU’s Data Protection Directive (DPD) and proposed regulation.
The secrecy paradigm, which is a controlling norm in trade secret
law, posits that only secrets are legally protectable; information once
disclosed is fair game in the public sphere. The DPD similarly
rejected the deceptively simple dichotomy of the secrecy paradigm by
persisting in the regulation of data use after a subject’s voluntary
disclosure. The context of initial disclosure and the ongoing contexts
of information use, including downstream injury, are defining features
of both Solove’s and Nissenbaum’s analyses. Just as the DPD newly
emphasized disclosure and consent for information practices when
persons remain identifiable, Nissenbaum posited that factors such as
notice, consent, and redaction may serve to maintain contextual integrity.
In toughening the requirement of explicit consent and allowing
a sort of consent revocation through the device of the right to be
forgotten, the proposed regulation is all the more consistent with the
concepts of harm-aversion and contextual integrity.
Solove acknowledged that an approach to privacy predicated on
extant values might require that the Supreme Court reconsider its
commitment to the secrecy paradigm — which it might. In present jurisprudence under the U.S. Fourth Amendment, the font of
constitutional privacy, the government can dip deeply into personal
information held by third parties, such as banks and telephone companies,
because the data are regarded as already disclosed. The concept
carries over into the civil context where, for example, the secrecy
paradigm is expressed through the tortious invasion of privacy requirement
that information have been guarded as secret (as in trade secret
law). Voluntary disclosure furthermore may manifest in tort through
a defense of consent (to intentional torts) or comparative fault (to
negligence torts). But in a recent case in which the Court, on narrow
grounds, reproved the covert installation of a GPS tracking device,
Justice Sotomayor hinted that a reconsideration of the dichotomy
might be in the cards. The decision in general confirmed the Court’s
willingness to adapt the Fourth Amendment to new technologies,
and GPS tracking is plainly “surveillance” in Solove’s terms. Writing in
concurrence, Justice Sotomayor acknowledged that GPS tracking can
accumulate “a wealth of detail about [a subject’s] familial, political,
professional, religious, and sexual associations,” and that such power is
“susceptible to abuse” — which is to say, compromised contextual
integrity may result in injury. She concluded: “[I]t may be necessary to
reconsider the premise that an individual has no reasonable expectation
of privacy in information voluntarily disclosed to third parties.”