16 August 2010

Biometrics

A new paper by Juliet Lodge on 'Quantum Surveillance and 'Shared Secrets': A Biometric Step Too Far?' [PDF] comments that -
Biometrics are a feature of information and communication technologies (ICTs). Their disproportionate use and the lax and arbitrary way in which they are defined and implemented endangers values, norms and practices central to accepted conceptions in the EU27 of transparency, data protection and data privacy. Concern over the indiscriminate and growing use of biometrics for increasingly mundane and imprecise purposes results in a breach of the earlier intention to ensure their proportionate deployment based on the principle of necessity. Deviation from this is now justified by reference to loose arguments about the alleged 'certainty' that biometric identifiers bring to cutting risk, and so enhancing 'security', however that is defined.
The paper was presented at the CEPS 'Liberty & Security in Europe' event last month. It builds on Lodge's 2006 LIBE paper on biometrics.

She argues that -
There are at least five underlying problems in over-optimistic and unwarranted 'trust' in the technologies (ICTs). The first problem is that reliance on assumed technological 'certainty' encourages groupthink and reliance on automated decision-making that exacerbate arbitrariness, and risks of inequality, discrimination and disregard for human dignity. The second is that what I call 'quantum surveillance' is inevitable given the tendency to interpret all manner of things – behaviour, movement, relations, associational links and emotion – as a 'biometric'. The third is that the transformational impact of ICTs on society and governance proceeds without sufficient ethical, socio-legal or political control, public consent or public accountability. The fourth is that privacy by design and smart data functionalities to ensure that the ICTs themselves safeguard and reveal only what the data subject permits are not being introduced swiftly or securely enough. The fifth is that cost and efficiency criteria coupled with ignorance of ICTs lead those responsible for public procurement to rely on private industry and vested interests to the detriment of society and democratic accountability. Quantum surveillance results.
One response might be that "the tendency to interpret all manner of things ... as a 'biometric'" is misplaced and should be resisted as sloppy thinking, in the same way that we might be cautious about applying buzzwords such as 'quantum' that offer an ostensible authority but in practice often obscure rather than clarify. Quantum cornflakes and dogfood?

Lodge concludes that -
The broad definition of 'biometrics' should not be accepted as legitimate if a quantum surveillance state and society are to be avoided, if citizen privacy and data are to be protected, and if security in the wider sense is to be safeguarded.
What she describes as her key findings are that -
• Quantum surveillance is happening without quantum leaps in ethical understanding, sociolegal and political controls and public accountability to ensure legitimacy, justice and the sustainability of democratic norms, values and practices.

• The intertwining of internal (AFSJ and internal market, including sustainable economy, environment and knowledge society) policies with external security presents significant challenges to innovative thinking. Disjointed policy-making insecuritises commodified citizens and states.

• Concern over the indiscriminate and growing use of biometrics for increasingly mundane and imprecise purposes results in a breach of the earlier intention to ensure their proportionate deployment for verifying and authenticating a person's claim to a specific, context-dependent identity.

• Technological innovation, and the way in which the EU member governments have accepted a definition of biometrics originating in the discourse of the US and its homeland security agenda, have led to an unthinking culture of biometricisation and commodification of citizens separate from legitimate border management intentions.

• Applications and policies using biometrics should be subject to stringent data protection risk assessment criteria.

• Biometricisation of citizens erodes the principle of citizen equality.

• Biometricisation and digital life should not be separated from e-governance and ICT use for social and commercial use in the public or private sector, or in joint public-private sector arrangements.

• The potential for biometrics to augment security needs to be revisited, and an effective and ethical EU privacy and personal data protection regime defined and enforced across governance and commerce.

• Biometrics must be recognised as a business opportunity and not simply accepted as an infallible tool for verifying identity. Commerce in biometrically verified and verifiable identities attracts commerce and criminal activity. Cybercrime is growing, and privacy respecting ICTs have a long way to go if citizen identity is to be better protected.

• Interoperability goals to boost the competitiveness of the knowledge society must cease to be separated from the discourse over securitising territorial borders.

• The implications for citizen and societal security from cybercrime and trade in e-identities need urgent attention, legislation and preferably a uniform definition of what constitutes a 'crime' and the institution of common penalties based on EU standards, if pervasive insecurity is not to result from e-identity (mis)use.

• The overall risk is unintentional insecuritisation owing to the lag between ICT innovation and up-to-date regulatory frameworks compounded by lack of overarching common EU rules on data storage, sharing, slicing, etc., for diverse purposes that third parties can exploit to the potential detriment of citizen privacy and data subject integrity.

• There is an urgent need to review with the Fundamental Rights Agency the implications for citizens of ever more automated decision-making that affects their exercise of fundamental rights and Single Market freedom of movement (persons, services, goods and capital).