'Consent as Friction' by Nikolas Guggenberger, 66 B.C. Law Review (2025)
The leading technology platforms generate several hundred billion dollars annually in revenue through algorithmically personalized advertising—with pernicious effects on our privacy, mental health, and democracy. To fuel their data-hungry algorithms, these platforms have long conditioned access to their services on far-reaching authorizations, embedded in boilerplate terms, to extract their users’ data. Until recently, privacy-sensitive alternatives were unavailable—even for a premium. Users faced a stark choice: submit to surveillance or forgo digital participation. I term this business practice “surveillance by adhesion.”
In July 2023, however, the European Court of Justice ruled in Meta Platforms Inc. v. Bundeskartellamt that surveillance by adhesion violated the European Union’s General Data Protection Regulation. To comply with the EU’s new regulatory paradigm, the leading (predominantly American) platforms must fundamentally revise their business models by either abandoning personalized advertising or obtaining individuals’ informed consent. In practice, the EU’s stringent guardrails—which mandate providing users with “real choice” beyond mere consent pop-ups and granular control—may render user consent so onerous to secure, precarious to sustain, restrictive to operationalize, and litigation-prone that they undermine the commercial viability of personalized advertising. Rather than empowering users to exercise control over their data, the consent mechanism may thus manifest as a vehicle for welcome friction, prompting a shift toward less invasive contextual advertising.
Building on these insights, this Article contends that U.S. policymakers and regulators should, and indeed can, likewise leverage consent as friction to undermine the economic viability of personalized advertising and other harmful surveillance-driven business models. This approach offers a pragmatic alternative to failed notions of user control over data, especially as democratic data governance too often remains beyond reach. Although the EU’s new regulatory paradigm offers one model, there are multiple avenues to harness consent as a source of friction across different legal contexts. In fact, state-level biometric privacy laws exemplify this strategy’s efficacy domestically. Their qualified consent requirements have thrown so much sand in the gears of biometric data collection and use that several leading technology companies have refrained from launching intrusive facial recognition applications altogether. By adopting this friction-based strategy, the Federal Trade Commission and state privacy enforcers can effectively establish potent data usage limitations.