English school illegally uses facial recognition
A school in the English county of Essex violated data protection law by using facial recognition in its canteen without first carrying out the required data protection impact assessment. The finding follows an investigation by the UK's Information Commissioner's Office (ICO), which has issued the school a warning.
In March 2023, Chelmer Valley High School in Chelmsford began using facial recognition technology as part of the payment system in its canteen. The school had previously used fingerprint scanners to let its roughly 1,200 students pay for lunch without cash.
According to The Telegraph, the school hoped facial recognition would shorten queues at mealtimes.
Risks not analyzed
As the ICO stated on Tuesday, however, the school's use of facial recognition was unlawful because no data protection impact assessment had been conducted beforehand. As a result, the authority explains, the data protection risks to the children affected could not be assessed.
Facial recognition processes biometric data that can uniquely identify individuals. According to the ICO, such data is considered particularly sensitive, and processing it therefore carries high risks. An impact assessment serves to identify and evaluate those risks.
Lynne Currie from the ICO said: “Handling people’s information correctly in a school canteen environment is as important as the handling of the food itself. We expect all organisations to carry out the necessary assessments when deploying a new technology to mitigate any data protection risks and ensure their compliance with data protection laws.”
Currie added that a data protection impact assessment (DPIA) is required by law and is an important instrument for protecting the rights of those affected, because it forces organisations to consider data protection from the very start of a project.
Opt-out instead of opt-in
The ICO also criticized the school for failing to obtain valid consent for the data processing. Parents were informed in writing about the new technology in March 2023 and given the opportunity to object. By law, however, explicit consent is required, so this opt-out approach was not legitimate.
Moreover, most of the students affected were old enough to give consent themselves; leaving any objection to their parents deprived them of the chance to exercise their own rights and freedoms. Neither parents nor students had been consulted beforehand about the decision to use the technology.
In November 2023 the school did retroactively obtain consent from those affected and also carried out a data protection impact assessment. This nevertheless remained a violation of data protection law, since both steps must be completed beforehand. Taking into account the measures the school had since taken, the ICO merely issued a warning; should the school violate data protection law again, further action may follow.
Currie said: “We’ve taken action against this school to show introducing measures such as FRT should not be taken lightly, particularly when it involves children.”
The ICO does not want to discourage schools from adopting new technologies. However, data protection must be a top priority in order to build trust and protect children's privacy.
The authority also issued legally non-binding recommendations to Chelmer Valley High School, advising it, among other things, to follow the ICO's guidance on video surveillance. That guidance includes a case study on facial recognition in schools, which warns that the technology poses a particular risk of discrimination; schools must therefore check whether alternatives exist that interfere less with the rights of those affected.
Criticism of border-style identity checks
Commenting on the case, Mark Johnson of the British civil rights organisation Big Brother Watch said: “The faceprints taken by these systems contain highly sensitive biometric data. No child should have to go through these kind of border-style identity checks just to get a school meal.” He added: “Children should be taught how to look after their personal data, not treated like walking bar-codes by their own schools and encouraged to give away their biometric data on a whim.”
According to the BBC, many British schools use payment systems based on fingerprint readers, while facial recognition remains uncommon.
In 2021, for example, it became public that nine schools in North Ayrshire, Scotland, had installed facial recognition in their canteens. In response, Defend Digital Me, an organization specializing in children's digital rights, called for a ban on biometric recognition technology in schools, arguing that it interferes disproportionately with children's right to privacy and is not necessary in a democratic society.
The schools eventually discontinued the project after the data protection authority intervened. Following an investigation, the ICO stated in 2023 that the deployment in the schools had likely violated privacy law. (js)