15 August 2018

Advocacy and Aadhaar

'Let’s be careful out there … : how digital rights advocates educate citizens in the digital age' by Efrat Daskal in (2018) 21(2) Information, Communication and Society 241-256 comments
From the early days of the printed press, citizens have challenged and modified the information environment as constructed by governments and media organizations. In the digital era, this struggle is manifested in the work of civil-society organizations calling to expand the boundaries of digital rights such as access to the internet, freedom of speech, and the right to privacy. Alongside their traditional activity of confronting governments and internet organizations, these bodies have also engaged in educating citizens about their rights. In order to shed light on such educational efforts, I examine the activities of four civil-society organizations operating in three countries (Germany, Israel, and the U.S.) by conducting a content analysis of their websites between 2013 and 2015. The results suggest that the organizations’ interactions with the public are guided by three main principles: (1) cultural informational framing: delivering accurate technological and political information, which is framed so as to resonate with the cultural premises and everyday lives of the target audiences; (2) personal activism: propelling citizens toward participation, primarily through political clicktivism and by providing them with technological guidance and tools for digital self-protection; and (3) branding digital rights activism: fostering a unique image for a particular organization’s digital rights activism, mostly through selling merchandise to citizens. Using these strategies, the organizations aim to construct the social–political–cultural identity of a generation who are knowledgeable, politically active, and aware of their rights in the digital age. The characteristics of this identity are discussed in the conclusion.
In India the report of the Srikrishna Committee on data protection, formally A Free and Fair Digital Economy: Protecting Privacy, Empowering Indians by the Committee of Experts under the Chairmanship of Justice B.N. Srikrishna, has highlighted concerns regarding India's weak data protection regime.

The Committee has separately provided a draft data protection Bill, of particular importance given the ongoing rollout of the Aadhaar biometric identity card.

The report states
This report is based on the fundamental belief shared by the entire Committee that if India is to shape the global digital landscape in the 21st century, it must formulate a legal framework relating to personal data that can work as a template for the developing world. Implicit in such a belief is the recognition that the protection of personal data holds the key to empowerment, progress, and innovation. Equally implicit is the need to devise a legal framework relating to personal data not only for India, but for Indians. Such a framework must understand from the ground up the particular concerns and aspirations pertaining to personal data shared by Indians, their fears and hopes. It is a platitude that such viewpoints may not necessarily be the same in developed countries, which already have established legal frameworks. The report thus ploughs its own furrow, responding to the challenges that India faces as a developing nation in the Global South. At the same time, it adopts learnings from best practices that exist in developed democracies with considerably advanced thinking on the subject. 
A. Existing Approaches to Data Protection 
In today's world, broadly three approaches to data protection exist. The US follows a laissez-faire approach and does not have an overarching data protection framework. US courts, however, have collectively recognised a right to privacy by piecing together the limited privacy protections reflected in the First, Fourth, Fifth and Fourteenth Amendments to the US Constitution. Consequently, certain legislation (the Privacy Act, 1974, the Electronic Communications Privacy Act, 1986 and the Right to Financial Privacy Act, 1978) protects citizens against the federal government. With regard to the private sector, while no omnibus legislation exists, the US has sector-specific laws with carefully tailored rules for specific types of personal data. For example, the GLB Act has well-defined provisions for the collection and use of financial data. 
The EU, at the vanguard of global data protection norms, has recently enacted the EU GDPR, which came into force on 25 May 2018. This replaces the Data Protection Directive of 1995. It is a comprehensive legal framework that deals with all kinds of processing of personal data while delineating the rights and obligations of parties in detail. It is both technology- and sector-agnostic and lays down the fundamental norms to protect the privacy of Europeans in all its facets. We are informed that 67 out of 120 countries outside Europe have largely adopted this framework or that of its predecessor. 
Though the aforementioned approaches have dominated global thinking on the subject, recently, China has articulated its own views in this regard. It has approached the issue of data protection primarily from the perspective of averting national security risks. Its cybersecurity law, which came into effect in 2017, contains top-level principles for handling personal data. A follow-up standard (akin to a regulation) issued earlier this year adopts a consent-based framework with strict controls on cross-border sharing of personal data. 
It remains to be seen how such a standard will be implemented. Each of these regimes is founded on each jurisdiction's own understanding of the relationship between the citizen and the state in general, and the function of the data protection law in particular. 
In the US, the laissez-faire approach to regulating data handling by private entities, while imposing stringent obligations on the state, is based on its constitutional understanding of liberty as freedom from state control. Data protection is thus an obligation primarily on the state and certain categories of data handlers who process data that are considered worthy of public law protection. In Europe, on the other hand, data protection norms are founded on the need to uphold individual dignity. Central to dignity is the privacy of the individual, by which the individual herself determines how her personal data is to be collected, shared or used with anyone, public or private. The state is viewed as having a responsibility to protect such individual interest. China, for its part, frames its law with the interests of the collective as the focus, based on its own privileging of the collective over the individual. 
B. Understanding the Contours of the Indian Approach 
Each of these legal regimes described above has acceptability in its respective jurisdiction because it captures the zeitgeist of the citizen-state relationship that exists in each. At the same time, it is trite that neither India's understanding of its citizen-state relationship nor its motivations for a data protection law coincide exactly with those of the aforementioned jurisdictions. The conceptualisation of the state in the Constitution is based on two planks: first, the state is a facilitator of human progress, and is consequently commanded by the Constitution in Part IV (Directive Principles of State Policy) to serve the common good; second, the state is prone to excess, and is hence checked by effectuating both a vertical (federal structure) and horizontal (three organs of government) separation of powers, as well as by investing every individual with fundamental rights that can be enforced against the state. 
The right to privacy has recently been recognised as a fundamental right emerging primarily from Article 21 of the Constitution, in Justice K.S. Puttaswamy (Retd.) v. Union of India.
To make this right meaningful, it is the duty of the state to put in place a data protection framework which, while protecting citizens from dangers to informational privacy originating from state and non-state actors, serves the common good. It is this understanding of the state's duty that the Committee must work with while creating a data protection framework. The TORs (annexed in Annexure A) mandate both a study of various data protection related issues in India and specific suggestions for a data protection framework and a draft bill. This must be seen in light of the objective of the Government of India in setting up the Committee, also contained in the TORs, "to unlock the data economy, while keeping data of citizens secure and protected". This objective appears to be based on the salient realisation that data has the potential both to empower and to harm. 
The transformative potential of the digital economy to improve lives in India and elsewhere is seemingly limitless at this time. Artificial Intelligence holds out the promise of new breakthroughs in medical research, and Big Data generates more calibrated searches and allows quicker detection of crime. Large-scale data analytics allows machines to discern patterns and constantly improves services in an endless virtual loop. The prospects of such data gathering and analysis to benefit citizens are immense. 
At the same time, the potential for discrimination, exclusion and harm is equally likely in a digital economy. The recent admission by Facebook that the data of 87 million users, including 5 lakh Indian users, was shared with Cambridge Analytica through a third-party application that extracted the personal data of Facebook users who had downloaded the application, as well as of their friends, is demonstrative of several such harms: users did not have effective control over their data. Further, they had little knowledge that their activity on Facebook would be shared with third parties for targeted advertisements around the US elections. The incident, unfortunately, is neither singular nor exceptional. Data gathering practices are usually opaque, mired in complex privacy forms that are unintelligible, leading to practices that users have little control over. Inadequate information on data flows and consequent spam or, worse still, more tangible harms are an unfortunate reality. Equally, the state collects and processes significant amounts of personal data of citizens, with much of such processing being related to its functions. Despite the fact that the State is able to exercise substantial coercive power, and despite ambiguous claims to personal data that may not be necessary for its functions, the State remains largely unregulated on this account. Currently, the law does little to protect individuals against such harms in India. The transfer of personal data (defined as "sensitive personal data or information") is governed by the SPD Rules. 
The SPD Rules were issued under Section 43A of the IT Act which holds a body corporate liable for compensation for any negligence in implementing and maintaining reasonable security practices and procedures while dealing with sensitive personal data or information. The SPD Rules expand on the scope of these reasonable practices and procedures. They define sensitive personal data and mandate the implementation of a policy for dealing with such data. Further, various conditions such as consent requirement, lawful purpose, purpose limitation, subsequent withdrawal of consent, etc., have been imposed on the body corporate collecting such information. 
The SPD Rules require the prior consent of the provider of the information before sensitive personal data is disclosed to a third party. Transfer of sensitive personal data outside India is permitted on the condition that the recipient country adheres to the same level of data protection as is applicable to the body corporate under the SPD Rules. The body corporate would further be deemed to have complied with reasonable security practices if it has complied with security standards and has comprehensive data security policies in place. 
While the SPD Rules were a novel attempt at data protection at the time they were introduced, the pace of development of the digital economy has made it inevitable that some shortcomings would become apparent over time. For instance, the definition of sensitive personal data is unduly narrow, leaving several categories of personal data outside its protective remit; its obligations do not apply to the government and may, on a strict reading of Section 43A of the IT Act, be overridden by contract. The IT Act and SPD Rules have also suffered from problems of implementation due to delays in appointments to the adjudicatory mechanisms created under the IT Act. Some of these are not peculiarly Indian problems but are endemic to several jurisdictions. 
The deficiencies in the regulation of data flows in India (and elsewhere in the world) are a consequence of the simplistic assumption that data flows are an unadulterated good. This is only partially accurate. It is clear that several data flows can cause considerable harm. More significantly, the treatment of free data flows as an intrinsic good, as the recent exposé of data sharing practices by Facebook demonstrates, has placed the interests of the individual in whose name the information flows as secondary to the interests of the companies of various kinds which deal with the data. This gives a different complexion to the terminology in various jurisdictions designating the individual whose data is being collected as the “data subject” and the entity that collects the data as the “data controller”. We begin by revisiting this terminology. 
C. Data Principals and Data Fiduciaries 
It is our view that any regime that is serious about safeguarding the personal data of the individual must aspire to the common public good of both a free and fair digital economy. Here, freedom refers to enhancing the autonomy of individuals with regard to their personal data in deciding how it is processed, which would lead to an ease of flow of personal data. 
Fairness pertains to developing a regulatory framework where the rights of the individual with respect to her personal data are respected and the existing inequality in bargaining power between individuals and entities that process such personal data is mitigated. In such a framework, the individual must be the “data principal” since she is the focal actor in the digital economy. The relationship between the individual and entities with whom the individual shares her personal data is one that is based on a fundamental expectation of trust. Notwithstanding any contractual relationship, an individual expects that her personal data will be used fairly, in a manner that fulfils her interest and is reasonably foreseeable. This is the hallmark of a fiduciary relationship. In the digital economy, depending on the nature of data that is shared, the purpose of such sharing and the entities with which sharing happens, data principals expect varying levels of trust and loyalty. For entities, this translates to a duty of care to deal with such data fairly and responsibly for purposes reasonably expected by the principals. This makes such entities “data fiduciaries”. 
Pursuant to this, and as a general canon, data fiduciaries must only be allowed to share and use personal data to fulfil the expectations of the data principal in a manner that furthers the common public good of a free and fair digital economy. It is our considered view that a regime based on the principles mentioned above and implemented through the relations described above will ensure individual autonomy and make available the benefits of data flows to the economy, as mandated by the TOR. 
The twin objectives of protecting personal data while unlocking the data economy have often been seen as conflicting with each other. Specifically, the TOR, which mandates both these objectives, is said to have set up a false choice between societal interests and individual interests, a trade-off between economic growth and data protection. 
It is argued that both are designed to achieve the constitutional objectives of individual autonomy, dignity and self-determination. In our view, ensuring the protection of personal data and facilitating the growth of the digital economy are not in conflict and, as has rightly been pointed out, serve a common constitutional objective. However, each of them is motivated by distinct intermediate rationales: the former ensures the protection of individual autonomy and consequent harm prevention, while the latter seeks to create real choices for citizens. These intermediate objectives are themselves complementary: individual autonomy becomes truly meaningful when real choice (and not simply an illusory notion of it) can be exercised, and likewise no real choice is possible if individuals remain vulnerable. The growth of the digital economy, which is proceeding apace worldwide, must be equitable, rights-reinforcing and empowering for the citizenry as a whole. In this light, to see the individual as an atomised unit, standing apart from the collective, neither flows from our constitutional framework nor accurately grasps the true nature of rights litigation. 
Rights (of which the right to privacy is an example) are not deontological categories that protect the interests of atomised individuals; on the contrary, they are tools that, as Raz points out, are necessary for the realisation of certain common goods. The importance of a right in this account lies not in the benefit that accrues to the rights holder but in the fact that this benefit is a public good that society as a whole enjoys. This is a critical distinction, and one often missed in simplistic individual-centric accounts of rights. 
This is an argument made most forcefully by Richard Pildes. Pildes provides an example: in Board of Education v. Pico, the question before the US Supreme Court was whether a decision by a school to ban certain books from the library on account of their being "anti-American, anti-Christian, anti-Semitic and just plain filthy" violated the students' right to free speech under the First Amendment. The decision to strike down the ban, Pildes believes, is justified not because the free speech right, in this case the right to receive information freely, is weightier than the state interest in promoting certain values in public education. Were this the case, it would be difficult to trammel the right to receive information freely at all. On the contrary, it was justified because the school could not remove books on the basis of hostility to the ideas that they contained: such reasons were illegitimate in this context, where the common good is a public education system that differentiates politics from education. A decision on rights is thus a decision on the justifiability of state action in a given context that is necessary to serve the common good. 
Thus the construction of a right is warranted not because it translates into an individual good, be it autonomy, speech or anything else, but because such a good creates a collective culture in which certain reasons for state action are unacceptable. In the context of personal data collection, use and sharing in the digital economy, it is our view that protecting the autonomy of an individual is critical not simply for her own sake but because such autonomy is constitutive of the common good of a free and fair digital economy. Such an economy envisages a polity where the individual is autonomously deciding what to do with her personal data, entities are responsibly sharing such data and everyone is using data, which has immense potential for empowerment, in a manner that promotes overall welfare. 
Thus, keeping citizens' personal data protected and unlocking the digital economy, as the TOR mandates, are both necessary. This will protect individual autonomy and privacy, which can be achieved within the rubric of a free and fair digital economy. This is the normative framework that India, as a developing nation, needs in order to assuredly chart its course in the increasingly digital 21st century. 
D. Following Puttaswamy 
This normative foundation of the proposed data protection framework is true to the ratio of the judgment of the Supreme Court of India in Puttaswamy.
The Supreme Court held that the right to privacy is a fundamental right flowing from the right to life and personal liberty, as well as from other fundamental rights securing individual liberty in the Constitution. In addition, individual dignity was also cited as a basis for the right. Privacy itself was held to have a negative aspect (the right to be let alone) and a positive aspect (the right to self-development). 
The sphere of privacy includes a right to protect one's identity. This right recognises the fact that all information about a person is fundamentally her own, and she is free to communicate or retain it for herself. This core of informational privacy, thus, is a right to autonomy and self-determination in respect of one's personal data. Undoubtedly, this must be the primary value that any data protection framework serves. However, there may be other interests to consider, on which the Court observed as follows: "Formulation of a regime for data protection is a complex exercise which needs to be undertaken by the State after a careful balancing of the requirements of privacy coupled with other values which the protection of data sub-serves together with the legitimate concerns of the State." 
Thus, like other fundamental rights, privacy too can be restricted in well-defined circumstances. For such a restriction, three conditions need to be satisfied: first, there is a legitimate state interest in restricting the right; second, the restriction is necessary and proportionate to achieve that interest; third, the restriction is by law. As the excerpt from Puttaswamy above establishes, two points are critical: first, the primary value that any data protection framework serves must be that of privacy; second, such a framework must not overlook other values, including collective values. In our view, the normative framework of a free and fair digital economy can provide a useful reference point for balancing these values in a particular case. Whether, in a given case, a right to privacy over that which is claimed exists and would prevail over any legitimate interests of the state will depend on how courts interpret the manner in which the needs of a free and fair digital economy can best be protected. This may happen by fully upholding the right, by finding the restriction justified, or by a partial application of one or the other. The normative framework for this exercise is provided by the values of freedom and fairness. After all, freedom and fairness are the cornerstones of our constitutional framework, the raison d'être of our struggle for independence. 
E. Chapters in the Report 
In order to ensure that a free and fair digital economy becomes a reality in India, there is certainly a need for a law that protects personal data. This report sets out the framework for the contents of such a law, which could further be instrumental in shaping the discourse on data protection in the Global South. 
Chapter 2 is a discussion of fundamental questions relating to the scope and applicability of such a law. The question of the scope of data protection laws in different jurisdictions is vexed: the seamless transferability of data across national boundaries has, for some, eroded the importance of the nation state. While the factual premise of seamless transferability is largely correct, absent a global regulatory framework, national legislation supported by well-established conflict of laws rules will govern issues relating to jurisdiction over personal data. In legislation for India, questions of scope and applicability must be answered according to our policy objective of securing a free and fair digital economy. This objective will be severely compromised if the data of Indians is processed, whether in India or elsewhere, without complying with our substantive obligations. Implicit in this is the ability of the state to hold parties accountable, irrespective of where data might have been transferred, and particularly to be able to enforce such obligations against errant parties. At the same time, this objective cannot be enforced in derogation of established rules of international comity, which respect the sovereignty of other jurisdictions in enforcing their own rules. 
Chapter 3 deals with the processing of personal data. Consistent with our view that the digital economy should be free and fair, the autonomy of the individual whose data is the lifeblood of this economy should be protected. Thus, a primary basis for processing of personal data must be individual consent. This recommendation is not oblivious to the failings of the consent framework. Consent is often uninformed, not meaningful and operates in an all-or-nothing fashion. This chapter provides an alternate framework of consent that treats the consent form, not as a means to an end, but rather as an end in itself. This imposes form and substance obligations on entities seeking consent as well as more effective mechanisms for individuals to track and withdraw consent. 
Chapters 4 and 5 deal with the obligations of data fiduciaries and the rights of data principals. Anyone who uses personal data has an obligation to use it fairly and responsibly. This is the cardinal tenet of the proposed framework. We envisage the DPA and the courts developing this principle on a case-by-case basis over time, ensuring robust protection for individual data. At the same time, certain substantive obligations are critical if the objective of a free and fair digital economy is to be met. Specifically, these obligations ensure that the data principal is aware of the uses to which personal data is put and create bright-line rules on when personal data can be collected and stored by data fiduciaries. This segues into Chapter 5, which deals with the rights of data principals. This is consistent with the principle that if the data principal is the entity who legitimises data flows, she must continue to exercise clearly delineated rights over such data. The scope of such rights, their limitations and their methods of enforcement are discussed in detail. 
The flow of data across borders is essential for a free and fair digital economy. However, such flows cannot be unfettered, and certain obligations need to be imposed on data fiduciaries who wish to transfer personal data outside India. At the same time, India's national interests may require local storage and processing of personal data. This has been dealt with in Chapter 6. 
Chapter 7 discusses the impact of the proposed data protection framework on all allied laws which may either set a different standard for the protection of privacy or might otherwise authorise or mandate the processing of large amounts of personal data. Particularly, the impact on and necessary amendments to the IT Act, the Aadhaar Act and the RTI Act are discussed. 
There are situations where the rights and obligations of data principals and data fiduciaries may not apply in totality. This manifests in limited instances where consent may not be used for processing in order to serve a larger public interest, such as 'national security', 'prevention and investigation of crime', 'allocation of resources for human development' and 'protection of the revenue'. These have been recognised in Puttaswamy as legitimate interests of the state. A discussion of such grounds where consent may not be relevant for processing is contained in Chapter 8. While some of the situations listed here only allow for processing without consent (non-consensual grounds), others are situations where the substantive obligations of the law apply partially (exemptions). A critical element of this discussion relates to the safeguards governing such processing in order to prevent its wrongful use. Specific safeguards for both the grounds and the partial exemptions to the law are thus delineated, together with the obligations that would continue to apply notwithstanding such derogation from consent.

Critical to the efficacy of any legal framework is its enforcement machinery. This is especially significant in India's legal system, which has often been characterised as long on prescriptions and short on enforcement, and requires careful redressal. To achieve this, enforcement of this law must be conceived as having both an internal and an external element. External enforcement requires the establishment of an authority, sufficiently empowered and adequately staffed, to administer data protection norms in India. However, we are cognisant of the limitations of a single authority in enforcing a law of such significant magnitude, irrespective of whether it has nation-wide presence and resources. Consequently, the internal aspect of enforcement implies the need to formulate a clear legislative policy on ex ante organisational measures. Such policy and measures are to be enforced through codes of practice developed in consultation with sectoral regulators, regulated entities and data principals, through an open and participatory process. Chapter 9 contains the details of the enforcement machinery under the proposed framework. 
The report concludes with a summary of recommendations that we would urge the Government of India to adopt expeditiously in the form of a data protection law. A suggested draft of such a law has been provided along with this report. 
F. Methodology 
While framing the report, the Committee has conducted wide consultations. A White Paper was published by the Committee on 27 November 2017 for public comments. In addition, four public consultations were conducted by the Committee in New Delhi on 5 January 2018, Hyderabad on 12 January 2018, Bengaluru on 13 January 2018, and Mumbai on 23 January 2018. A number of views were expressed both in the written comments submitted to the Committee and in oral representations at the public consultations. As will be evident from this report, such views, together with further research, have significantly informed our work, often departing from tentative viewpoints that may have been presented in the White Paper. This demonstrates the participatory and deliberative approach followed by the Committee in the task before it. 
We are cognisant of the limitations of this report and lay no claim to exhaustiveness. The digital economy is a vast and dynamic space and we have consciously avoided wading into territories that do not strictly come within the framework of data protection issues set out in our TOR. Needless to say, such issues will have to be gone into at the appropriate time if our framework of a free and fair digital economy is to be truly upheld. Notably, these issues include those of intermediary liability, effective enforcement of cyber security and larger philosophical questions around the citizen-state relationship in the digital economy, all of which have been raised in public comments and committee meetings. Our deliberations have also raised questions related to non-personal data and emerging processing activities that hold considerable strategic or economic interest for the nation. Data processing is equally linked to the creation of useful knowledge, impinging on values such as reliability, assurance and integrity. Many issues related to electronic communications infrastructure and services also arise in the larger context of the digital economy. We leave such questions to the wisdom of a future committee in the hope that they will be duly considered. 
G. Summary: A Fourth Way to Privacy, Autonomy and Empowerment 
In our view, a combination of the elements outlined above would deliver a personal data protection law that protects individual privacy, ensures autonomy, allows data flows for a growing data ecosystem and creates a free and fair digital economy. In other words, it sets the foundations for a growing, digital India that is at home in the 21st century. This is distinct from the approaches in the US, the EU and China and represents a fourth path. This path is relevant not only to India but to all countries in the Global South which are looking to establish or alter their data protection laws in light of rapid developments in the digital economy. After all, the proposition on which the framework is based is simple, commending itself to universal acceptability: a free and fair digital economy that empowers the citizen can only grow on the foundation of individual autonomy, working towards maximising the common good.