
Met Police to Deploy ‘81% Inaccurate’ Live Facial Recognition Tech

Dominique Adams



The Metropolitan Police will start using live facial recognition cameras in London to help tackle “serious crime”. 

The Metropolitan Police said it will deploy live facial recognition (LFR) cameras across the capital city, despite major concerns over the accuracy of the technology and the legality of its use.

The cameras will be used at specific London locations, with a watch list of people wanted for serious and violent crimes generated for each deployment.

Police hope the tech will help tackle serious crime including violence, gun and knife crime, child sexual exploitation and terrorism.

Nick Ephgrave, the Met’s assistant commissioner, said the force was taking the LFR tech “operational” following ten trials across the capital.

“This is an important development for the Met and one which is vital in assisting us in bearing down on violence,” he said.

“As a modern police force, I believe that we have a duty to use new technologies to keep people safe in London. We are using a tried-and-tested technology, and have taken a considered and transparent approach in order to arrive at this point.”

While police say the cameras were able to identify 70% of suspects, an independent review found the accuracy to be considerably lower.

The Met’s senior technologist, Johanna Morley, claimed that a false alert was triggered only 0.1% of the time, meaning one in every 1,000 faces scanned was mistakenly flagged as belonging to a suspect on a police watch list.

An independent report, commissioned by the Met and revealed by Sky News, found that the tech was in fact 81% inaccurate.
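The two figures are not necessarily contradictory, because they measure different things: the 0.1% figure is the rate of false alerts per face scanned, while the 81% figure is the share of alerts that turn out to be wrong. A short sketch shows how both can hold at once; the crowd size and number of correct alerts below are illustrative assumptions, not Met data.

```python
# Illustrative sketch of the base-rate effect: a per-face false-alert
# rate of 0.1% and an ~81% error rate among alerts can coexist.
# Only the 0.1% figure comes from the article; the rest is assumed.

faces_scanned = 100_000        # assumed crowd size (hypothetical)
false_alert_rate = 0.001       # Met figure: 0.1% of faces wrongly flagged
correct_alerts = 23            # assumed number of genuine matches (hypothetical)

false_alerts = faces_scanned * false_alert_rate   # 100 wrong alerts
total_alerts = false_alerts + correct_alerts      # 123 alerts in total
share_wrong = false_alerts / total_alerts

print(f"False alerts: {false_alerts:.0f}")
print(f"Share of alerts that are wrong: {share_wrong:.0%}")  # ~81%
```

Because genuine suspects are rare in a large crowd, even a tiny per-face error rate produces far more false alerts than true ones, which is how the independent reviewers could report such a high inaccuracy figure among flagged matches.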

Unlike the police use of fingerprints and DNA, which are strictly regulated, facial images are not subject to the same laws.

Privacy campaigners have said the rollout poses a “serious threat to civil liberties”. But both the Met and the Information Commissioner’s Office have deemed the use of the technology lawful.


Silkie Carlo, the director of Big Brother Watch, told Sky News: “This decision represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK.

“It flies in the face of the independent review showing the Met’s use of facial recognition was likely unlawful, risked harming public rights and was 81% inaccurate.

“This is a breath-taking assault on our rights and we will challenge it, including by urgently considering next steps in our ongoing legal claim against the Met and the home secretary. This move instantly stains the new government’s human rights record and we urge an immediate reconsideration”.

It was recently revealed in a leaked document that the EU is considering a ban on the use of the tech in public spaces until a regulatory framework can be formulated. This could potentially affect the UK under article 72 of the withdrawal agreement, under which EU data processing law will continue to apply in the UK until the end of the transition period.

Dominique Adams

Staff Writer, DIGIT