Data watchdog the Information Commissioner’s Office (ICO) has criticised a Scottish council for using facial recognition technology in nine schools. The ICO said the technology was deployed in such a way that it had likely “infringed data protection law”.
In October 2021, concerns were raised that North Ayrshire Council was using facial recognition systems in schools in its region. The ICO says it commenced engagement with the council to establish whether the processing of data broke data protection law – the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA).
In its response to North Ayrshire Council, published yesterday, the ICO says that while the technology could be deployed lawfully, the council had deployed it "in a manner that is likely to have infringed data protection law" under UK GDPR.
“It is critical that [North Ayrshire Council] fully understands its obligations under data protection law in relation to the deployment of new technologies that process children’s special category biometric data,” wrote Ken Macdonald, head of regions at the ICO.
Are councils deploying facial recognition technology in schools?
In August 2021, North Ayrshire Council published a notice on its website – which has since been removed – stating that it would be using a new system called iPayimpact from the following October. The system would allow contactless meal payments, meaning that cash and cheques wouldn’t be required.
At the time, the council said that students could opt out of the system and use a PIN verification service. It also said that 97% of children or their parents had given consent for the new system, the Guardian reported.
According to the tender, North Ayrshire Council was planning to extend the system to nine academies, 49 primary schools and one additional support needs school. It also referenced applying the technology in a number of 'hospitality locations and outside catering units' if the solution permitted.
The contract for the system was awarded to CRB Cunninghams, which said the technology could cut the average transaction time for a pupil paying for their lunch to five seconds.
However, when the ICO stepped in to investigate the use of facial recognition technology, the council paused its use of the system.
ICO warns of risks in using facial recognition around children
In its letter to North Ayrshire Council, the ICO made it clear that several articles of the GDPR had been infringed by the use of facial recognition technology, including those relating to transparency, the right of people to be informed that their data is being collected, and data retention.
It also recommended improvements that the council could make when "considering similar issues in the future", including taking data minimisation and data accuracy into account.
“New technologies such as facial recognition can offer benefits and efficiencies, but their use is not without risk from a data protection point of view,” explained Macdonald. “That risk is heightened where children’s data is being processed.”
He added that UK GDPR makes it clear that children are to receive “specific protection when processing their personal data”. This is because “they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data”.
Requirements for valid consent unlikely to have been met by North Ayrshire Council
The ICO has flagged that the council needs to ensure that there is a “valid lawful basis for processing children’s data”.
“Our view is that consent was the appropriate lawful basis for processing children’s special category biometric data for the purpose of cashless catering in this case,” explains Macdonald. “However, the requirements for valid consent were unlikely to have been met in this case.”
He goes on to write that North Ayrshire Council needs to explain in age-appropriate language how children’s data will be collected, used, stored and retained: “The risks associated with its use should be clearly set out.”
Civil liberties groups speak out in support of ICO’s decision
Civil liberties and privacy campaigning organisation Big Brother Watch told Tech Monitor that the ICO’s notice shows that the council had “failed in its obligation to protect children’s data rights”.
“This should act as a warning to other schools considering the introduction of this intrusive technology,” says Madeleine Stone, legal and policy officer, Big Brother Watch. “Children should not be subject to biometric identity checks while queuing for their lunch. This airport security-style technology has no place in schools.”
However, Mariano delli Santi, legal and policy officer at Open Rights Group, says it is disappointing that the ICO didn't include considerations around facial recognition technology itself: "According to UK data protection law, whether something is appropriate, effective or proportionate are questions that organisations are required to consider and answer before the deployment of a new system is even planned, not as an afterthought," he says.
“Facial recognition is a highly intrusive technology, and the frivolous decision of exposing pupils to the level of risk associated with the use of biometric data to solve an arguably inexistent problem, such as enabling cashless payments in school, should represent a primary concern for our regulator.”
He continues that the absence of these considerations fails to provide clarity to organisations and assurance to the public: "Innovation requires trust, and why should we trust innovation if it exposes us to needless risks and create new problems instead of solving them?"
A spokesperson for North Ayrshire Council said: “We welcome the clarity which has now been received from the Information Commissioner’s Office. Following the initial interest of the Commissioner’s Office in October 2021, we immediately ceased use of the facial recognition system and thereafter deleted all biometric data.”