'Consent to Behavioural Targeting in European Law - What are the Policy Implications of Insights from Behavioural Economics?' (Amsterdam Law School Research Paper No. 2013-43) by Frederik J. Zuiderveen Borgesius
comments that
Behavioural targeting is the monitoring of people’s online behaviour to target advertisements to specific individuals. European law requires companies to obtain the informed consent of internet users before they use tracking technologies for behavioural targeting. Other jurisdictions also emphasise the importance of choice for internet users. But many people click ‘I agree’ to any statement that is presented to them. This paper discusses insights from behavioural economics to analyse problems with informed consent to behavioural targeting from a regulatory perspective. What are the policy implications of insights from behavioural economics in the context of behavioural targeting? Two approaches to improving regulation are explored. The first focuses on empowering the individual, for example by making informed consent more meaningful. The second focuses on protecting the individual. If aiming to empower people is not the right tactic to protect privacy, perhaps specific prohibitions should be introduced.
Borgesius concludes
... People’s choices regarding privacy can be analysed using economic theory. Consent to behavioural targeting could be seen as a trade-off: people often consent to a company processing their personal data in exchange for the use of a “free” service. However, information asymmetries hinder meaningful decisions.
Many people don’t realise that their online behaviour is tracked. If somebody doesn’t realise that they are releasing personal data in exchange for the use of a “free” service, that “choice” can’t be informed. But even if companies asked people for consent to behavioural targeting, information asymmetry problems would remain. First, people often don’t know what a company will do with their personal data. Second, even if they knew, it would be hard to predict the consequences of future data usage. Third, people don’t know the value of their personal data, so they don’t know how much they “pay”. In sum, a lack of information makes it hard for people to make meaningful decisions about behavioural targeting.
Because of transaction costs, such as the time it would take to inform oneself, the information asymmetry problem is hard to solve. Reading privacy policies would cost too much time, as they tend to be long and difficult to read. Some suggestions were made to mitigate the information asymmetry problems.
First, there’s a need for education about behavioural targeting and online privacy in general. People can’t really choose if they don’t understand the question. Second, data protection law must be applied more vigorously. Companies that seek consent must do so in clear and straightforward language. Third, research is needed into better ways of presenting information to people. But even if all these measures were taken, considerable information asymmetries would probably remain. If people are asked to consent to data collection hundreds of times per day, even simple requests are overwhelming.
Moreover, insights from behavioural economics suggest that even fully informed people face problems making privacy choices in their own best interests. Many biases influence our decisions. For instance, people are myopic and tend to discount future disadvantages. If people can only use a service if they “consent” to behavioural targeting, they might ignore the costs of possible future privacy infringements and opt for immediate gratification. Furthermore, people tend to stick with the default. Many other biases influence privacy decisions.

Data protection law has answers to only some of these problems. If consent were implemented as requiring affirmative action from the data subject (an opt-in system), the status quo bias would nudge people towards privacy-friendly choices. But myopia suggests that if the use of a service is made dependent on consenting to behavioural targeting, many people might consent, contrary to their own stated interests. The framing effect suggests that people can be pushed towards decisions that they might later regret. In sum, insights from behavioural economics cast doubt on the effectiveness of informed consent as a privacy protection measure. Many people click ‘I agree’ to any statement that is presented to them.
So what should the law do? A rather blunt response to myopia would be to prohibit companies from making the use of a service dependent on consenting to tracking. But sector-specific rules that prohibit certain behavioural targeting practices are also possible. However, prohibitions to protect people against themselves reek of unwarranted paternalism. On the other hand, it could be argued that some prohibitions would protect society as a whole. Some examples of possible prohibitions were mentioned. For instance, the tracking of children for behavioural targeting could be prohibited, or online news services could be prohibited from engaging in behavioural targeting. These examples show that it wouldn’t be easy to agree on prohibitions.
Lastly, there might be a middle ground. Instead of introducing prohibitions, the lawmaker could use insights from behavioural economics. The law could set defaults and make them stickier by adding transaction costs. For instance, the law could set formal requirements for consent, such as a minimum of five mouse clicks or a letter by registered mail. Such measures would leave freedom of choice intact, at least formally, but the status quo bias, combined with transaction costs, would steer people towards privacy. If new rules are adopted, some services that rely on income from behavioural targeting might no longer be offered for “free”. This should be taken into account.
In sum, the lawmaker has a range of options. There will probably always be a large category of cases where relying on informed consent, in combination with data protection law’s other safeguards, is the appropriate approach. For those cases, transparency and consent should be taken seriously. More effective ways of presenting information are needed. But this isn’t enough. Merely relying on data protection law to protect people’s privacy in the context of behavioural targeting doesn’t seem sufficient. If we decide, after debate, that it’s better for our society if certain practices don’t happen, prohibitions may be the best answer.