United Kingdom: Police in Essex introduce live facial recognition
Police in the county of Essex plan to deploy live facial recognition on a regular, ongoing basis. Until now, the force had used the technology only on a trial basis. Earlier this year, the then Conservative government announced plans to invest millions in facial recognition systems – and drew criticism from civil liberties groups.
Live facial recognition works by having cameras capture images of everyone who passes by. The images are automatically compared against a list of wanted individuals. If the system identifies a potential match, officers are informed and must decide on the spot what further action to take. The Information Commissioner’s Office (ICO), Britain’s data protection authority, warned in 2021 that live facial recognition systems risk collecting biometric data on a massive scale. Biometric data is considered particularly sensitive because it cannot easily be changed and can be used to identify a person throughout their life.
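To make the matching process described above more concrete, here is a minimal, purely illustrative sketch of such a pipeline: a captured face is turned into an embedding vector, compared against a watchlist by similarity, and either flagged for an officer to review or discarded. The names, the similarity threshold, and the placeholder embedding step are assumptions for illustration; they do not describe the actual system used by Essex Police.

```python
# Illustrative sketch only: a simplified, hypothetical live facial recognition
# matching loop. All names and parameters are assumptions, not details of any
# deployed police system.
import numpy as np

EMBEDDING_DIM = 128      # size of a face embedding vector (assumed)
MATCH_THRESHOLD = 0.8    # similarity above which officers are alerted (assumed)

def embed_face(image: np.ndarray) -> np.ndarray:
    """Placeholder for a face-embedding model; returns a unit-length vector."""
    vec = np.resize(image.flatten().astype(float), EMBEDDING_DIM)
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def check_against_watchlist(frame: np.ndarray,
                            watchlist: dict[str, np.ndarray]) -> str | None:
    """Compare one captured face against the watchlist.

    Returns the identifier of a potential match for an officer to review,
    or None, in which case the captured biometric data is discarded.
    """
    probe = embed_face(frame)
    for name, reference in watchlist.items():
        similarity = float(np.dot(probe, reference))  # cosine similarity (unit vectors)
        if similarity >= MATCH_THRESHOLD:
            return name   # alert: a human officer decides what happens next
    return None           # no match: delete the captured data, keep nothing

# Usage example with random arrays standing in for camera frames and watchlist photos.
rng = np.random.default_rng(0)
watchlist = {"wanted_person_A": embed_face(rng.random((16, 16)))}
result = check_against_watchlist(rng.random((16, 16)), watchlist)
print("potential match:", result) if result else print("no match, data deleted")
```

In a real deployment the embedding model, the watchlist curation, and the threshold choice are exactly where accuracy problems and false matches arise, which is the core of the criticism discussed below.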
In Essex, the cameras will be placed on marked police vehicles and used in public places. These mobile units have been in use for some time in other parts of the UK, for example in London.
Police in Essex want to use the technology to “catch wanted criminals,” Chief Constable Ben-Julian Harrington said in an interview with the BBC. The system will be used, he said, only for “really serious” crimes.
According to Harrington, “It is always an officer who makes the decision whether someone should be arrested or not.” The system is supposed to “automatically and immediately” delete the data of people who are not wanted once they have been checked against the database.
Criticism from civil liberties groups
The Essex police force first tested the technology in public places last year. Regular use is now set to begin by the end of this year, Harrington told the BBC. The force already uses a retrospective facial recognition system that compares footage from surveillance cameras with images in a police database.
The British civil liberties organization Big Brother Watch compared the use of live facial recognition in public places to a “digital police line-up.” The technology, the group says, is “dangerously imprecise” and poses a serious threat to privacy and civil liberties.
Critics have repeatedly pointed out that facial recognition is unreliable. In London, for example, researchers from the University of Essex accompanied police during trial deployments and found that 81 percent of the matches flagged by the system were incorrect.
Scant legal basis
The same researchers also pointed out that there was no legal basis for the use of facial recognition systems.
In January of this year the Justice and Home Affairs Committee of the House of Lords found that the deployment of real-time facial recognition “lacks a clear legal foundation.” In a letter to the Home Secretary, the committee wrote that it “accepts that [live facial recognition] may be a valuable tool for police forces, but we are deeply concerned that its use is being expanded without proper scrutiny and accountability.” The chair of the committee, Baroness Hamwee, said: “We question why there is such disparity between the approach in England and Wales and other democratic states in the regulation of [live facial recognition].”
Nevertheless, the Conservative government that was voted out of office this month had planned to expand the use of facial recognition: In fall 2023, Policing Minister Chris Philp urged police to deploy live facial recognition on a larger scale.
In April of this year, the government announced plans to invest more than 50 million pounds in facial recognition systems. A portion of the funds was earmarked for “mobile units” that could be deployed in shopping districts.
Big Brother Watch has called live facial recognition a dystopian instrument of mass surveillance, and criticized the planned spending as “an abysmal waste of public money.”
Legal challenges against police
The civil liberties organization is currently backing two lawsuits prompted by facial recognition misidentifications; one of them is directed against London’s Metropolitan Police.
In the London case, a 38-year-old man was on his way home when the police’s facial recognition system falsely identified him as a wanted individual. Officers detained the man for nearly half an hour and threatened him with arrest – even though he was able to show identification.
The second case concerns the use of the controversial technology by retailers.
Silkie Carlo, director of Big Brother Watch and a co-claimant in the case against the London police, said when the lawsuit was filed: “Facial recognition is inaccurate and dangerously out of control in the UK. No other democracy in the world spies on its population with live facial recognition in the cavalier and chilling way the UK is starting to.” (js)