The so-called “right to be forgotten” has been put firmly on the agenda, both of academia and of policy. Although the idea is intuitive and appealing, the legal form and practical implications of a right to be forgotten have hardly been analysed so far. This contribution aims to critically assess what a right to be forgotten could or should entail in practice. It outlines the current socio-technical context as one of Big Data, in which massive data collections are created and mined for many purposes. Big Data involves not only individuals’ digital footprints (data they themselves leave behind) but, perhaps more importantly, also individuals’ data shadows (information about them generated by others). And contrary to physical footprints and shadows, their digital counterparts are not ephemeral but persistent. This presents particular challenges for the right to be forgotten, which are discussed in the form of three key questions. Against whom can the right be invoked? When and why can the right be invoked? And how can the right be effected? Advocates of a right to be forgotten must clarify which conceptualisation of such a right they favour – a comprehensive, user-control-based right to have data deleted in due time, or a narrower, context-specific right to a “clean slate” – and how they think the considerable obstacles presented in this paper can be overcome, if people are really to be enabled to have their digital footprints forgotten and to shun their data shadows.

Koops comments that -
Looking at the world of Big Data we live in, I tend to believe that the data-deluge genie is out of the bottle. No matter how important the ideal of informational self-determination may be, users will not be able to put it back again. I doubt whether there is sufficient policy urgency in Europe to substantially change data-protection law to give data subjects a full-blown right to have data deleted, and to simultaneously mandate the forgetfulness-by-design that is required to make a right to be forgotten in any way meaningful. However, scholars and policy-makers with a different outlook may feel differently, and aim for devising legal and technical solutions that can address the challenges I outlined for a user-controlled right to be forgotten.

The same issue of SCRIPTed features 'India’s New Data Protection Legislation: Do The Government’s Clarifications Suffice?' [PDF] by Raghunath Ananthapur, who criticises the Data Privacy Rules of 24 August 2011 issued by the Department of Information Technology. The Rules apply to 'sensitive data' of any individual collected, processed, or stored in India via a "computer resource" by a body corporate located in India.
In any case, it is clear that a generic right to be forgotten does not currently exist. There are flavours of such a right in current data protection and sectoral “clean-slate” laws, but the former are limited in strength, the latter in scope. Given the different possible conceptualisations and their different foci, anyone who advocates the establishment of a full-blown right to be forgotten must clarify what this right means and how it can be effected. As argued in this paper, considerable obstacles need to be overcome if people are really to be able to have their digital footprints forgotten and to shun their data shadows.
Ananthapur comments that the Department has "sent positive signals by reacting quickly to the Indian outsourcing industry’s concerns by publishing clarifications to the Data Privacy Rules". The clarifications, "while they will certainly benefit the Indian outsourcing industry", are "half baked, and appear to have had, as the objective, exempting third party Indian outsource providers from the compliance with the most controversial provision – 'consent conditions'". Quite so.
In the UK the background briefing for the Protection of Freedoms Bill, which among other matters deals with restraints on biometrics, notes the 'opt in' mechanism for disregarding convictions for consensual same sex activity (i.e. conduct that has been decriminalised over the past 30 years, albeit conduct that might still be addressed under 'public order' and 'offensive behaviour' statutes in the UK and Australia).
The briefing states that -
Chapter 4 of Part 5 contains provisions that will allow individuals with a conviction or caution for an offence under section 12 (buggery) or 13 (gross indecency between men) of the Sexual Offences Act 1956 (or the corresponding earlier offences or military service offences), involving consensual gay sex with another person aged 16 or over, to apply to the Home Office to have details of that conviction or caution disregarded.
Consensual sex between men over the age of consent was decriminalised in 1967. At that time the age of consent was 21 years, but it was lowered to 18 years in 1994 and to 16 years in 2000. However, details of any historic convictions for consensual gay sex with over 16s continue to be recorded on police records and appear on CRB criminal record certificates.
If an application to have a conviction or caution disregarded is granted, the details of that conviction or caution will be removed from the Police National Computer, and any local police or other records, and will no longer be revealed on a CRB certificate. In addition, a person with a disregarded conviction or caution will not have to disclose that conviction or caution to anyone under any circumstances, for example, on a job application or in court proceedings.
There are estimated to be some 50,000 convictions and cautions recorded on the Police National Computer for section 12 and 13 offences; some 16,000 of these are estimated to relate to behaviour that is now decriminalised.