Police should be banned from using live facial recognition (LFR) technology in all public spaces because its use breaches ethical standards and human rights law, a study has concluded.
LFR involves linking cameras to databases containing photos of people. Images from the cameras can then be checked against those photos to see if they match.
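The matching step described above can be sketched as a comparison between a face captured on camera and a watchlist of stored images. The following is a minimal, purely illustrative sketch: real LFR systems reduce each face to a numeric template (an "embedding") produced by a trained model, and every name, vector, and threshold here is invented for illustration.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face templates (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical watchlist of pre-computed templates (toy 3-number vectors,
# standing in for the high-dimensional embeddings a real system would use).
watchlist = {
    "person_A": [0.9, 0.1, 0.3],
    "person_B": [0.2, 0.8, 0.5],
}

def check_frame(frame_template, threshold=0.95):
    """Compare a camera-frame template against every watchlist entry.

    Returns (name, score) if the best match clears the threshold,
    otherwise (None, score). The threshold is an assumed tuning knob:
    lower values raise false alerts, higher values miss true matches.
    """
    best_name, best_score = None, 0.0
    for name, template in watchlist.items():
        score = cosine_similarity(frame_template, template)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= threshold:
        return best_name, best_score
    return None, best_score
```

The choice of threshold is where accuracy statistics such as false alert rates come from: a face that merely resembles a watchlist entry may still clear the bar, which is one source of the bias and privacy concerns the report raises.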
British police have experimented with the technology, believing it can help combat crime and terrorism. But in some cases, courts have ruled against the way police have deployed LFR and handled infringements of the privacy rights of people walking in the streets where the technology was used. There are also concerns about racial bias.
The report, from the Minderoo Centre for Technology and Democracy, at the University of Cambridge, says LFR should be banned from use in streets, airports and any public spaces – the very areas where police believe it would be most valuable.
The study examined three deployments of LFR, one by the Metropolitan police and two by South Wales police. Both forces told the Guardian they had made improvements and believed in the benefits of LFR.
The report author Evani Radiya-Dixit said: “We find that all three of these deployments fail to meet the minimum ethical and legal standards based on our research on police use of facial recognition.
“To protect human rights and improve accountability in how technology is used, we must ask what values we want to embed in technology and also move from high-level values and principles into practice.”
The report adds: “We have shown how police use of facial recognition fails to incorporate many of the known practices for the safe and ethical use of large-scale data systems. This problem moves well beyond the concern of bias in facial recognition algorithms.”
Inside UK law enforcement, LFR is seen as potentially the next big crime-fighting innovation, on a par with the introduction of fingerprints. It could boost the ability to locate an individual and track them.
Critics warn it could lead to abuses of human rights on a huge scale, including against the rights to protest and freedom of assembly.
Overseas, more authoritarian regimes, such as China, have used the technology as part of their suite of repressive tools.
The Met said the accuracy of the algorithm it uses had improved greatly with help from the National Physical Laboratory and input from the Defence Science and Technology Laboratory, with a false alert rate of less than 0.08%.