UK: Cameras running emotion detection technology tested at train stations

Entrance to Manchester Piccadilly station. The tests reportedly did not include the use of facial recognition. (Source: IMAGO / Pond5 Images)

Thousands of people were subjected to algorithmic surveillance at several train stations in Great Britain, reports Wired magazine, citing documents obtained in response to freedom of information requests. Civil liberties advocates have now filed complaints with the British data protection authority.

According to the report, the system was tested at a total of eight train stations across Britain over the last two years – among them busy stations in London, Leeds, Glasgow and Manchester. Network Rail, the publicly owned company that owns most of Britain’s railway stations and manages many of the largest, is responsible for the tests.

The stated purpose of the tests was to alert personnel in the case of certain incidents. According to Wired, the system was supposed to be able to recognize if people were trespassing on tracks or if platforms were overcrowded. But another goal was to recognize so-called “antisocial behavior” – including running, skateboarding or smoking.

Demographic information

But the system was also used to estimate the age and sex of passengers – and to detect emotions such as “happy, sad, and angry.” Per Wired, the documents contained suggestions “that the data could be used in advertising systems in the future.” It is unclear from Wired’s reporting, however, how extensively the emotion analysis technology was actually implemented.

Automated emotion detection in particular has drawn considerable criticism. As the Information Commissioner’s Office (ICO), the British data protection authority, stated in late 2022, “Emotion analysis relies on collecting, storing and processing a range of personal data.” This carries a significant risk of discrimination.

Deputy Commissioner Stephen Bonner characterized these systems as “immature,” saying, “They may not work yet, or indeed ever.”

Past studies have shown that facial expressions cannot reliably be used to determine emotions: a person with a grim look on their face is not necessarily angry. Advocacy groups critical of the technology also point out that it is scientifically dubious.

New and old cameras

As Wired reports, one part of the tests involved so-called smart CCTV cameras that can “detect objects or movements.” Another part involved feeding footage from security cameras already in use into analysis software – software reportedly developed by Amazon. Between five and seven cameras were used for the tests at each station.

The system under testing is intended to automatically alert station staff when certain incidents occur. Facial recognition technology, which aims to identify individuals, was not used, according to Wired.

Network Rail would not respond to Wired’s questions – not even to say whether the technology was still in use. A spokesperson said only that Network Rail uses “a range of advanced technologies across our stations to protect passengers, our colleagues, and the railway infrastructure,” and added: “We always comply with the relevant legislation regarding the use of surveillance technologies.”

Criticism and complaints

Jake Hurfurt at the British civil liberties organization Big Brother Watch, which filed the request to obtain the documents, told Wired, “The rollout and normalization of AI surveillance in these public spaces, without much consultation and conversation, is quite a concerning step.” The analysis of “passenger demographics” was particularly concerning, Hurfurt said.

Carissa Véliz of the University of Oxford said of the tests: “Systems that do not identify people are better than those that do, but I do worry about a slippery slope.” She added: “There is a very instinctive drive to expand surveillance. Human beings like seeing more, seeing further. But surveillance leads to control, and control to a loss of freedom that threatens liberal democracies.”

Véliz pointed to similar tests conducted at a London Underground station: at first, people’s faces were blurred in the footage, but the approach later changed – the faces of people suspected of fare dodging were left visible, and the footage was retained.

Hurfurt of Big Brother Watch said in a statement that he had submitted a complaint in response to the tests at train stations. “Network Rail had no right to deploy discredited emotion recognition technology against unwitting commuters at some of Britain’s biggest stations.”

He added: “Technology can have a role to play in making the railways safer, but there needs to be a robust public debate about the necessity and proportionality of tools used.” Hurfurt warned that everyone’s privacy could be at risk – especially if the technology was misused. (js)