'Privacy's Trust Gap' by Neil M. Richards and Woodrow Hartzog in the Yale Law Journal (forthcoming) comments
It can be easy to get depressed about the state of privacy these days. In an age of networked digital information, many of us feel disempowered by the various governments, companies, and criminals trying to peer into our lives to collect our digital data trails. When so much is in flux, the way we think about an issue matters a great deal. Yet while new technologies abound, our ideas and thinking—as well as our laws—have lagged in grappling with the new problems raised by the digital revolution. In their important new book, Obfuscation: A User’s Guide for Privacy and Protest (2016), Finn Brunton and Helen Nissenbaum offer a manifesto for the digitally weak and powerless, whether ordinary consumers or traditionally marginalized groups. They call for increased use of obfuscation, the deliberate addition of bad information to interfere with surveillance—a strategy that can be “good enough” to do the job for individuals much or even most of the time. Obfuscation is attractive because it promises to empower individuals against the shadowy government and corporate forces of surveillance in the new information society. While this concept represents an important contribution to the privacy debates, we argue in this essay that we should be hesitant to embrace obfuscation fully.
We argue instead that as a society we can and should do better than relying on individuals to protect themselves against powerful institutions. We must think about privacy instead as involving the increasing importance of information relationships in the digital age, and our need to rely on (and share information with) other people and institutions to live our lives. Good relationships rely upon trust, and the way we have traditionally thought about privacy in terms of individual protections creates a trust gap. Doubling down on obfuscation would risk deepening that trust gap. Rather, we believe that the best solution for problems of privacy in the digital society is to use law to create incentives to build sustainable, trust-promoting information relationships.
We offer an alternative frame for thinking about privacy problems in the digital age, and propose that a conceptual revolution based upon trust is a better path forward than one based on obfuscation. Drawing upon our prior work, as well as the growing community of scholars working at the intersection of privacy and trust, we offer a blueprint for trust in our digital society. This consists of four foundations of trust—the commitment to be honest about data practices, the importance of discretion in data usage, the need for protection of personal data against outsiders, and the overriding principle of loyalty to the people whose data is being used, so that it is data, and not humans, that are exploited. We argue that we must recognize the importance of information relationships in our networked, data-driven society. Substantial incentives already exist for digital intermediaries to build trust. But when incentives and markets fail, the obligation for trust-promotion must fall to law and policy. The first-best privacy future will remain one in which privacy is safeguarded by law, in addition to private ordering and self-help.
'Contracting Over Privacy: Introduction' by Omri Ben-Shahar and Lior Strahilevitz in (2016) 43(2) Journal of Legal Studies introduces
papers presented at the symposium Contracting over Privacy, which took place at the Coase-Sandor Institute for Law and Economics at the University of Chicago in fall 2015. The essay highlights a quiet legal transformation whereby the entire area of data privacy law has been subsumed by consumer contract law. It offers a research agenda for privacy law based on the contracting-over-privacy paradigm.
The authors comment
What are the legal implications of the classification of privacy notices as enforceable consumer contracts? For firms, the contractual nature of privacy notices ensures two beneficial functions. First, privacy notices are deployed to shield firms against liability for data privacy practices that, absent consumer consent, would violate privacy laws. For example, absent consent, Gmail’s practice of scanning contents of users’ e-mail messages would be a violation of the Wiretap Act, and Facebook’s practice of identifying users in uploaded photos would be a violation of state privacy laws. The contractual status of privacy notices means that users grant consent to these practices and thus provide firms a critical safe harbor.
The second function that privacy notices perform is the assurance for consumers that some uses of the data, which are otherwise permissible even without consent, would not occur. For example, firms and websites may keep logs of customers’ activity, but they can promise in their privacy notices not to do so. If privacy notices are contracts, such promises are binding, and their breach would be actionable. Moreover, the FTC can (and does) treat breaches of these promises as deceptive trade practices. Avowing such potential liability is a credible way for firms to entice hesitant consumers to engage with them. Firms dealing with sensitive content, like adult websites, indeed make explicit and clear promises to limit data sharing with third parties, and cloud-computing sites make explicit promises to follow stringent data security standards (Marotta-Wurgler 2016).
The contractual nature of privacy notices has significant implications for lawmakers working to design statutory privacy protections. The first implication is for the design of default rules. If statutory privacy rights are merely default rules, lawmakers should anticipate wholesale opt outs. Firms that develop business models that are constrained by statutory privacy rules would post privacy notices that effectively override these rules.
The powerful incentives of firms to induce their customers to give up their privacy rights also suggest that the choice between opt-in and opt-out schemes is of less importance than people usually assume. Opt-in schemes are thought to be more protective, because they require firms to get consumers’ affirmative consent to override the pro-consumer status quo. Opt-out schemes, by contrast, put the burden on consumers to initiate the exit from the pro-business status quo. Recent FCC regulations, for example, present the shift to an opt-in regime as a meaningful step toward more privacy protection, as this regime requires consumers’ explicit consent before collecting sensitive data such as geographical location or financial information. But firms are very good at getting consumers to opt in when doing so furthers the business’s interest (Willis 2013), and businesses are able to ask consumers repeatedly to change their minds if they initially resist information sharing. If indeed firms elicit such consumer consent with great ease, the opt-in framework makes little difference.
Once again, consumers may so easily agree to opt in, or fail to opt out, because of a lack of information. Informed consumers might refuse to opt in or might initiate their own opt outs. These consumers would walk away from firms that refuse to provide the statutory privacy protections that they demand. Uninformed consumers, by contrast, would stick with any default rule. In such an environment of imperfect information, designing optimal default rules has to account for two separate concerns. First, it has to recognize that there are consumers who do care and who would seek to opt out of an undesirable default rule. For some, the default rule could be insufficiently protective, and they would look for more protection. For others, it would be too protective, and they would prefer to waive the protection for a price discount. These opt outs create transaction costs (the cost of becoming informed about the default rule as well as the cost of contracting around it), and a well-designed default rule has to minimize such costs. But the design of the default rule has to recognize, in addition, that many consumers would remain uninformed about the default rule and refrain from opting out, regardless of its content. For this group the default rule is sticky, and it ought to be designed with an eye to maximizing the value of the transaction. This is a general insight into the optimal design of default rules in consumer contracts: such a rule has to meet two criteria—minimizing the cost of opt outs and maximizing the value of transactions when opt outs do not occur (Bar-Gill and Ben-Shahar 2016).
An additional implication of the contractual nature of privacy notices is the role of disclosures. Contracts over privacy—like any other consumer standard-form contract—are often long and complex. Is there a way to make such contracts simpler? Can the law require firms to present consumers with pared-down versions of these privacy notices that would effectively inform consumers of the privacy risks? These questions have risen to the fore of consumer protection law in many areas, as regulators and commentators spend much effort designing simpler, smarter, and more user-friendly disclosures. In the privacy area, proposals to utilize best practices in the presentation of privacy notices have been widely embraced, and more radical suggestions to use “nutrition facts”–type warning boxes are also intuitively advocated. But would such efforts have the desired effect on informing consumers’ choices? There is some evidence that the answer is no (Ben-Shahar and Chilton 2016) and that the use of the privacy notice to engender trust may be limited (Martin 2016).
In the end, then, the law and economics of contracting over privacy differs only in detail, but not in principle, from the law and economics of consumer contracts. Courts overwhelmingly treat them in the same way, and for good reasons. Consumers’ consent may be ill-informed, but regulatory alternatives might be worse. Consumer contract law has tools to combat overreaching by firms, and these tools—rather than superfluous notions of heightened disclosure or informed consent—ought to guide privacy protection. Such tools allow courts to strike down intolerable provisions, and in a separate article we propose to deny firms the advantages that they bury in cryptic boilerplate (Ben-Shahar and Strahilevitz 2016).
Accordingly, the papers from the symposium Contracting over Privacy collected in this issue examine general questions of contract formation, design, interpretation, and extracontractual norms and trust—all in the context of privacy. Privacy is not sui generis; it is instead a valuable laboratory to examine the evolution of contract law in the digital era.