British Government to Invest Millions in Facial Recognition

[Image: facial recognition van in England] The use of facial recognition by police in Britain has been criticized for years. (Source: IMAGO / Offside Sports Photography)

The British government plans to invest more than 50 million pounds (64 million euros) in facial recognition systems. The Home Office made the announcement last Wednesday. Civil liberties groups criticize the move as a waste of public funds – and warn of consequences for citizens’ privacy.

The investment is part of an effort to crack down on shoplifting and attacks on retail workers. The plan includes the increased use of facial recognition technology “to help catch perpetrators and prevent shoplifting in the first place,” the Home Office wrote.

According to the announcement, a total of 55.5 million pounds will be spent over the next four years to enable police to deploy facial recognition. Four million pounds are earmarked for “mobile units” equipped with live facial recognition that can be deployed to shopping districts. The units will be able to identify people wanted by the police – “including repeat shoplifters,” the Home Office stated.

Such units have already been deployed in London for several years. Criticism began as early as the testing phase, however: researchers from the University of Essex who accompanied police during trial runs found that the system had a failure rate of 81 percent.

The researchers also pointed out that there was no legal basis for the use of facial recognition.

Tracking the Movements of Repeat Offenders

Plans for the expanded use of facial recognition were announced alongside plans for more severe penalties for serial shoplifters and other offenders. Assaulting a retail worker will become a standalone criminal offense, punishable by up to six months in prison or an “unlimited” fine.

Repeat offenders can also be forced to wear devices that track their movements, and can be barred from visiting certain stores – facial recognition is meant to help identify violators.

Warnings of Mass Surveillance

Prime Minister Rishi Sunak stated that violent crime in England and Wales has fallen since 2010. “Yet shoplifting and violence and abuse towards retail workers continues to rise.”

Pointed criticism of the government’s plans has come from the British civil liberties organization Big Brother Watch. The NGO’s director, Silkie Carlo, said: “It is completely absurd to inflict mass surveillance on the general public under the premise of fighting theft.” Instead of providing police with the resources necessary to pursue criminals, Carlo said, the government was relying on offenders to walk in front of its cameras.

Carlo characterized the investment as “an abysmal waste of public money on a dangerously authoritarian and inaccurate technology that neither the public nor parliament has ever voted on.” It comes at a cost to the privacy and civil liberties of the people of Britain.

“Live facial recognition may be commonplace in China and Russia but these Government plans put the UK completely out of sync with the rest of the democratic world,” the head of Big Brother Watch stated.

More Deployment Scenarios Planned?

Meanwhile, the use of facial recognition could expand even further beyond the plans currently outlined. The British newspaper The Times, citing a government source, reported on April 5 that the government intends to present its strategy for use of the technology in the coming months. One idea under consideration is to equip cameras in train stations with the controversial technology.

In the fall of 2023, the Home Office announced a joint initiative with large retailers such as Primark, Marks & Spencer, and the Co-op chain of supermarkets. The cooperative effort, dubbed “Pegasus,” includes the use of facial recognition: retailers will provide footage from their surveillance cameras to the police, who will compare it against police databases. The project is financed in part by the companies themselves.

The plan has drawn substantial criticism. After it was announced, several human rights organizations demanded that the retailers withdraw from the project. They warned that the use of facial recognition technology could “amplify existing inequalities.”

Among the organizations’ criticisms was that facial recognition software more frequently misidentifies people with darker skin color, “meaning that already marginalized groups are more likely to be subject to an invasive stop by police, or at increased risk of physical surveillance, monitoring and harassment by workers” while shopping. (js)