'Automated facial recognition and policing: a Bridge too far?' by Joe Purshouse and Liz Campbell, (2021) Legal Studies, comments:
Automated facial recognition (AFR) is perhaps the most controversial policing tool of the twenty-first century. Police forces in England and Wales, and beyond, are using facial recognition in various contexts, from evidence gathering to the identification and monitoring of criminal suspects. Despite uncertainty regarding its accuracy, and widespread concerns about its impact on human rights and broader social consequences, the rise of police facial recognition continues unabated by law. Both the Government and the domestic courts were satisfied that police use of this technology was regulated adequately by existing statutory provisions governing the processing of data and police surveillance generally. That is, until the recent judgment of the Court of Appeal in R (Bridges) v Chief Constable of South Wales Police and Others [2020] EWCA Civ 1058, where it was held that the respondent's use of AFR was unlawful. This paper provides an analysis of AFR, reflecting on the outcome of that case and evaluating its nuanced findings. We suggest that the judgment leaves considerable room for police AFR to continue with only minor, piecemeal amendment to the legal framework. Drawing on comparative experience and relevant socio-legal scholarship, we argue that the relatively unfettered rise of police facial recognition in England and Wales illuminates deeper flaws in the domestic framework for fundamental human rights protection and adjudication, which create the conditions for authoritarian policing and surveillance to expand.