The 56-page Big Brother Watch report, Face Off - The lawless growth of facial recognition in UK policing, comments:
Facial recognition has long been feared as a feature of a future authoritarian
society, with its potential to turn CCTV cameras into identity checkpoints,
creating a world where citizens are intensively watched and tracked.
However, facial recognition is now a reality in the UK – despite the lack
of any legal basis or parliamentary scrutiny, and despite the significant
concerns raised by rights and race equality groups. This new technology
poses an unprecedented threat to citizens’ privacy and civil liberties, and
could fundamentally undermine the rights we enjoy in public spaces.
Police forces in the UK have rolled out automated facial recognition at a pace
unmatched by any other democratic nation in the world. Leicestershire Police, South
Wales Police and the Metropolitan Police have deployed this technology at
shopping centres, festivals, sports events, concerts, community events – and
even a peaceful demonstration. One police force even used the surveillance
tool to keep innocent people with mental health issues away from a public
event.
In this report, we explain how facial recognition technology works, how it
is being used by police in the UK, and how it risks reshaping our rights. We
are seeking to raise awareness of this growing issue with parliamentarians
and inform the wider public about what is happening behind the cameras.
In this report, we:
• Reveal new statistics following a series of freedom of
information requests, exposing the shocking inaccuracy and
likely unlawful practices within a number of police forces
using automated facial recognition;
• Analyse the legal and human rights implications of the
police’s use of facial recognition in the UK;
• Review the evidence that facial recognition algorithms often
disproportionately misidentify minority ethnic groups and
women;
• Present guest contributions from allies worldwide warning
about the impact of facial recognition on rights, including
contributions from representatives of the American Civil
Liberties Union, the Electronic Frontier Foundation, the
Georgetown Privacy Centre, and the Race Equality Foundation.
We conclude by launching our campaign against the lawless growth of facial
recognition in the UK, supported by rights groups, race equality groups,
technologists, lawyers and parliamentarians.
The report's key findings:
• The overwhelming majority of the police’s ‘matches’ using
automated facial recognition to date have been inaccurate.
On average, a staggering 95% of ‘matches’ wrongly identified
innocent people.
• Police forces have stored photos of all people incorrectly
matched by automated facial recognition systems, leading
to the storage of biometric photos of thousands of innocent
people.
Metropolitan Police
• The Metropolitan Police has the worst record: less than 2%
of its automated facial recognition ‘matches’ have been
accurate, while over 98% wrongly identified innocent members
of the public.
• The force has correctly identified only 2 people using the
technology – neither of whom was a wanted criminal. One of
the people matched was on the watch list in error; the
other was on a mental health-related watch list. Meanwhile, 102
innocent members of the public were incorrectly identified
by automated facial recognition.
• The force has made no arrests using automated facial
recognition.
South Wales Police
• South Wales Police’s record is hardly better: only 9% of its
matches have been accurate, whilst 91% of matches wrongly
identified innocent people (see the illustrative calculation
after this list).
• 0.005% of ‘matches’ led to arrests, numbering 15 in total.
• However, at least twice as many innocent people have been
significantly affected: police staged interventions with 31
innocent members of the public who were incorrectly identified
by the system and then asked to prove their identity, and thus
their innocence.
• The force has stored for 12 months the biometric photos of
all 2,451 innocent people wrongly identified by the system,
under a policy that is likely to be unlawful.
• Despite this, South Wales Police has used automated facial
recognition at 18 public places in the past 11 months –
including at a peaceful demonstration outside an arms fair.
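
As a rough illustration only, the false-match rates quoted above can be reproduced from the counts cited in this summary. The sketch below uses the figures given here (102 false and 2 true matches for the Metropolitan Police; 2,451 false matches and a 91% false-match rate for South Wales Police); the roughly 242 true matches derived for South Wales is an inference from those numbers, not a figure taken from the report.

```python
# Illustrative sketch, not part of the report: reproducing the quoted
# false-match rates from the counts cited in this summary.

def false_match_rate(false_matches: int, true_matches: int) -> float:
    """Share of 'matches' that wrongly identified innocent people."""
    return false_matches / (false_matches + true_matches)

# Metropolitan Police: 102 false matches and 2 true matches are cited above.
met_rate = false_match_rate(102, 2)
print(f"Metropolitan Police false-match rate: {met_rate:.1%}")   # ~98.1%

# South Wales Police: 2,451 false matches are cited above; a 91% false-match
# rate implies roughly 2,451 / 0.91 - 2,451 ≈ 242 true matches (an inference,
# not a report figure).
swp_rate = false_match_rate(2451, 242)
print(f"South Wales Police false-match rate: {swp_rate:.1%}")    # ~91.0%
```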
Custody images
• Out of the 35 police forces that responded to our Freedom
of Information request, not one was able to tell us how many
photos of innocent people it holds in its custody image
database.