United Kingdom: Faulty facial recognition prompts legal challenges

London police van equipped with facial recognition
The civil liberties organization Big Brother Watch criticizes the expanding use of facial recognition in Great Britain. (Source: IMAGO / Offside Sports Photography)

Two people in London and Manchester were misidentified by facial recognition software in separate incidents. Now they have brought legal challenges against London’s Metropolitan Police and Facewatch, a company that sells facial recognition systems to retailers. Big Brother Watch is supporting the legal actions and demands that the controversial technology be banned.

One of the claimants is 38-year-old Shaun Thompson, who works to combat youth violence with the Street Fathers initiative. As Big Brother Watch reports, he was on his way home from a volunteer shift when the facial recognition system used by the London police falsely identified him as a wanted individual.

Officers held him for almost 30 minutes. Big Brother Watch reports that the officers repeatedly tried to take his fingerprints and threatened him with arrest. This was despite Thompson providing multiple forms of identification showing that he was not the person the police were searching for.

Said Thompson, “They were telling me I was a wanted man, trying to get my fingerprints and trying to scare me with arrest, even though I knew and they knew the computer had got it wrong.” He was angry, Thompson said, at being treated “as though I was guilty.”

“I’m bringing this legal challenge because I don’t want this to happen to other people. Facial recognition is like stop and search on steroids and doesn’t make communities any safer. It needs to be stopped.”

Police deploying the technology in public spaces

For several years now, London’s Metropolitan Police has deployed mobile units equipped with facial recognition. The system drew criticism even during its testing phase: researchers from the University of Essex accompanied police during trial runs and determined that the system had a failure rate of 81 percent. The researchers also pointed out that there was no legal basis for the use of facial recognition.

Big Brother Watch likewise stresses that “police use of facial recognition is not enabled by any specific piece of legislation and has not been authorized by parliament.”

According to a report by the BBC, police use of the controversial technology in London continues to increase. In 2023 police used the technology 23 times; this year they have already used it 67 times. Alongside this growth, there have also been instances of misidentification.

Facial recognition used to fight theft

Some retailers in the UK are also using facial recognition to identify shoplifters. According to Big Brother Watch, in February a 19-year-old whom the NGO is calling Sara was trying to shop at a Manchester location of the Home Bargains chain. The store’s live facial recognition system, which is operated by Facewatch, falsely identified her as a shoplifter. She was searched by staff, called a thief and thrown out of the store. Staff also told her that she was banned from various stores throughout the UK.

Sara said in a statement: “I have never stolen in my life and so I was confused, upset and humiliated to be labelled as a criminal in front of a whole shop of people.” Stores should be prohibited from using facial recognition, she said. She has brought suit against both Facewatch and Home Bargains.

According to Big Brother Watch, Facewatch has admitted that its software misidentified Sara. Throughout the UK, retailers like Southern Co-op, Flannels, and Sports Direct also use the technology in their stores. Big Brother Watch reports that Facewatch customers can share photos of persons they suspect to be shoplifters. This is problematic, the NGO says, because “facial biometric data is as sensitive as passport data.” This case is only the “tip of the iceberg” – more and more people are seeking help from Big Brother Watch after being misidentified.

Data protection law violations

Big Brother Watch first filed a complaint against Facewatch with the Information Commissioner’s Office (ICO) in 2022. The UK data watchdog found that the company had violated data protection laws on multiple fronts, including through unlawful data processing. Big Brother Watch, however, criticized the ICO’s conclusion that “no further regulatory action” was required.

According to an investigation by the UK newspaper the Observer, officials at the Home Office interceded on the company’s behalf, pressuring the ICO to reach a conclusion favorable to Facewatch in its review.

Big Brother Watch urges ban

Silkie Carlo, director of Big Brother Watch and a co-claimant in the suit against the London police, said that Shaun Thompson’s and Sara’s experiences “are proof that facial recognition surveillance poses a real threat to the public’s rights and should be urgently banned.” Said Carlo, “These legal challenges are a landmark step towards protecting the public’s privacy and freedom from Orwellian live facial recognition.”

Carlo stated further: “Facial recognition is inaccurate and dangerously out of control in the UK. No other democracy in the world spies on its population with live facial recognition in the cavalier and chilling way the UK is starting to, and it is alarming that the Government is seeking to expand its use across the country.”

The British Home Office announced in April that it planned to invest millions in facial recognition. The money would be used in part to buy new mobile units for the police that could be equipped with live facial recognition and deployed in shopping districts.

In fall 2023 the Home Office also announced a joint initiative with retailers, including large chains like Primark and Marks & Spencer, as well as the supermarket chain Co-op. The partnership includes the use of facial recognition: retailers are meant to provide footage from their surveillance cameras to the police, who will compare it against their database.

Human rights organizations have criticized facial recognition software’s tendency to more frequently misidentify people of color. Increased use of the technology could lead to already marginalized groups being stopped and searched by the police at disproportionately high levels, and to their being surveilled by retail workers while shopping. (js)