The UK High Court has dismissed a judicial review case against the Metropolitan Police’s use of live facial-recognition (LFR) technology, ruling there are sufficient constraints in place to prevent abuse and ensure compliance with human rights law.
The landmark legal challenge was brought by anti-knife crime campaigner Shaun Thompson – who was wrongfully identified by the Met’s system and subjected to a prolonged stop as a result – and Silkie Carlo, director of privacy group Big Brother Watch. It argued there are no meaningful constraints on how the Met can deploy the technology.
In particular, their challenge hinged on the argument that the Met’s policy on where it can be deployed and who it can be used to target is so permissive, and leaves so much discretion to the force, that it cannot be considered “in accordance with law”.
However, the High Court ultimately agreed with the Met’s lawyers that “the Policy contains adequate and lawful constraints” over how and where the technology can be used.
While the Court of Appeal ruled in August 2020 that the use of LFR by South Wales Police was unlawful because the policy in place left excessive discretion in the hands of the force, the High Court found that in the Met’s case, its policy contained clear deployment criteria that effectively prevent individual officers from acting on “whim, caprice, malice or predilection”.
Although Thompson and Carlo argued that the Met’s policy could lead to disproportionate deployment rates in areas with large ethnic minority communities, the court said it “heard no developed or meaningful challenge on discrimination grounds” that would allow it to accept this argument.
It added that although a properly evidenced discrimination challenge may succeed if the policy has “the effect of discriminating against a section of the community”, this point was only “faintly asserted” by the claimant’s lawyers.
On the potentially chilling effect LFR use could have on protest, the court added that the Met’s policy “recognises and deals with the risk of a chilling effect on aspects of public life… [and] acts as an effective safeguard against arbitrary outcomes”.
Judgment responses
Thompson, however, said he plans to appeal the outcome “to protect Londoners from facial recognition being used for mass surveillance and leading to situations like mine”, in which he was misidentified, detained and threatened with arrest.
“No one should be treated like a criminal due to a computer error,” he said. “I was compliant with the police but my bank cards and passport weren’t enough to convince the police the facial recognition tech was wrong. It’s like stop and search on steroids. It’s clear the more widely this is used, the more innocent people like me risk being criminalised.
“My daily work getting knives off the streets with the Street Fathers proves we can keep London safe through community action, not by spying on the public with cameras that real criminals already know how to dodge.”
Carlo said that “the fight against live facial-recognition mass surveillance is far from over” despite the “disappointing” judgment, adding: “There has never been a more important time to stand up for the public’s rights against dystopian surveillance tech that turns us into walking ID cards and treats us like a nation of suspects.
“Innocent people deserve clear and strict protections from live facial-recognition cameras, which should be reserved for the most serious cases rather than used to scan millions of people, and that is what the appeal will seek to achieve.
“This legal challenge, which was made possible by concerned members of the public, has already led to a change in the Met’s facial-recognition policy and to a payment awarded to Mr Thompson who was misidentified by the tech and threatened with arrest.”
Responding to the court’s decision, Met commissioner Mark Rowley said it represented “a significant and important victory for public safety” and “confirms that the Met is right” to be using the technology.
“The court has been clear: our use of live facial recognition is lawful and supported by strong safeguards. The judgment confirms that we are deploying this technology responsibly and with care,” he said in a statement.
“It shows that fairness, accuracy and accountability were part of the design from the beginning. It also recognises that the Met has strong oversight and safeguards in place. These include checks to ensure use is proportionate and that people’s rights – such as privacy and freedom of expression – are protected in a way which does not breach human rights.”
He added that LFR technology will play a key role in the Met’s accelerating use of “smart policing tools”, which will help to make the best possible use of limited resources.
“The courts have confirmed our approach is lawful. The public supports its use. It works. And it helps us keep Londoners safe. The question is no longer whether we should use live facial recognition – it’s why we would choose not to,” he said.
“Technology is advancing at record speed, and policing cannot afford to stand still – criminals won’t. Facial recognition is transformational for policing. Government and Parliament will want to carefully consider how they continue to enable, rather than over-regulate, the use of technologies that help us prevent crime and protect the public as proven today.”
What’s next?
The ruling follows the Home Office announcing plans to ramp up its deployment of artificial intelligence and facial-recognition technologies under wide-ranging reforms to UK policing, and comes after the department’s consultation on a dedicated legal framework for LFR and other forms of biometric identification.
While the use of LFR by police – beginning with the Met’s deployment at Notting Hill Carnival in August 2016 – has already ramped up massively in recent years, there has so far been minimal public debate or consultation, with the Home Office claiming for years that there is already a “comprehensive” legal framework in place.
At the start of its LFR consultation – to which the Home Office is yet to formally respond – the department said that although a “patchwork” legal framework for police facial recognition exists (including for the increasing use of the retrospective and “operator-initiated” versions of the technology), it does not give police the confidence to “use it at significantly greater scale … nor does it consistently give the public the confidence that it will be used responsibly”.
It added that the current rules governing police LFR use are “complicated and difficult to understand”, and that an ordinary member of the public would be required to read four pieces of legislation, police national guidance documents and a range of detailed legal or data protection documents from individual forces to fully understand the basis for LFR use on their high streets.
Responding to the judicial review decision, Malcolm Dowden, a privacy expert at Pinsent Masons, said the ruling would open the door to wider deployment of the tools by the authorities, especially in the context of the Home Office consultation.
“This case had been viewed as the first major challenge to deployment based on [the College of Policing’s authorised professional practice] APP guidance on using facial recognition,” he said. “Its rejection is likely to fuel increased use of automated facial recognition, not only in policing but also – following the recent Home Office consultation – in areas such as border and immigration control.”