Facial recognition software was rolled out to police the Champions League final. The only problem was that almost all of the people the tech flagged as criminals weren’t.
Data published on South Wales Police’s website suggests that of the 170,000 people who arrived in Cardiff for the match between Real Madrid and Juventus, 2,470 were identified as potential criminals. However, the force has since conceded that 2,297 of these supposed identifications, or 92%, were wrong.
South Wales police point out that “no facial recognition system is 100% accurate” and that the program has led to a number of arrests.
“Over 2,000 positive matches have been made using our ‘identify’ facial recognition technology, with over 450 arrests,” a spokesperson stated, omitting reference to the impact on those misidentified.
The force attributed the false positives at the football final to “poor quality images” supplied by agencies including UEFA and Interpol.
South Wales police have run into similar problems at other high-profile events.
Their figures revealed that 46 people were wrongly identified at an Anthony Joshua boxing match, and 42 false positives were registered at a rugby match between Wales and Australia in November.
A recent London police trial of facial recognition technology at a Six Nations rugby match had an even worse hit-rate, generating 104 “alerts”, of which 102 were false.
Criticism of inaccuracy
Civil liberties campaign group Big Brother Watch has been a vocal critic of the technology since its inception.
“Not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool,” the group says.
The group highlights the fact that the technology consumes a great deal of police time and resources, requiring officers to sift through large numbers of images in search of a genuine match.
It points out that facial scanners misidentified more people at one event than were correctly spotted in nine months of use.
“If you have technology that is not up to scratch and it is bringing back high returns of false positives then you really need to go back to the drawing board.”
Big Brother Watch has also criticised the technology for curtailing freedoms of assembly and free speech.
“In the UK, we have already seen how real-time facial recognition was shamefully deployed at a peaceful demonstration and used to identify individuals with mental health issues at a public event,” said Big Brother Watch lead researcher Jennifer Krueckeberg. “This shows that not only criminals but people who are perceived as troublemakers can easily be targeted.”
She believes a further risk is a facilitation of “politically motivated surveillance”, whereby citizens are prevented from protesting against the state and its agencies.
Such surveillance is already prevalent in China and even in the US – with the notorious COINTELPRO and similar programs being used since the 1950s to spy on non-violent protest groups.
It was also recently revealed that the FBI tracked Black Lives Matter protesters for no other reason than their involvement in anti-racism protests.
In a submission to the parliamentary joint committee on intelligence and security, the Human Rights Law Centre stated that both false positive and false negative results for facial recognition are more likely to result in the matching of ethnic minorities in Australia.
It cited studies that found facial recognition has “a bias towards the dominant ethnic group in the area in which it is developed”, with the rate of false positives, and of the resulting police interactions, rising for racial minorities.
Closer to home
A bill introduced into Federal Parliament in February would establish a national facial recognition system, giving police access to a hub linking all identification photo databases so that they can be matched against CCTV images.
While the UK database contains only images of people who have been arrested, the Australian scheme would include all state and territory driver’s licence photos, along with passport, visa and citizenship images.
These images could then be matched against live images of faces in crowds and public places.
Such technology has the potential to greatly increase the level of surveillance in Australia – a country that already has the most pervasive metadata retention laws in the developed world.