London police are set to deploy real-time facial recognition technology across the city, while in the U.S., Hartsfield-Jackson Airport in Atlanta is rolling out the technology alongside other biometrics, making it the first fully biometric airport in the country.
The Metropolitan Police announced that, after testing facial recognition, the force has moved past the trial stage and is ready to permanently deploy the cameras across London. According to a report by the BBC, the cameras will be placed in locations popular with shoppers and tourists, such as Stratford’s Westfield shopping center and the West End.
Each individual system will have its own “watch list” consisting of images of criminals wanted for serious and violent crimes.
Several human-rights groups have called facial recognition a worrying technology. One of them, the British group Liberty, described the move as a “dangerous and sinister step.”
“This is a dangerous, oppressive and completely unjustified move,” Clare Collier, advocacy director at Liberty, said in a statement. “Facial-recognition technology gives the state unprecedented power to track and monitor any one of us, destroying our privacy and our free expression.”
This comes amid calls from politicians and campaigners in the UK to stop the police from using live facial recognition for public surveillance, the BBC reported.
Facial recognition technology has shown numerous problems over the years, such as racial bias. Other problems were noted by Fight For The Future, which ran a campaign against implementing the technology at music venues and cited “dangers to their fans in the form of police harassment including — misidentification, deportation, arrests for outstanding charges during an event and drug use during an event, discrimination at their concerts, and fans in a permanent government database,” all of which are valid concerns.
Last year, Activist Post repeatedly reported on studies finding that the technology’s accuracy isn’t all it’s marketed to be. Big Brother Watch, a watchdog group observing the UK Metropolitan Police trials, stated that the technology misidentified members of the public as potential criminals, including a 14-year-old black child in a school uniform who was stopped and fingerprinted by police.
In eight trials in London between 2016 and 2018, the technology produced “false positives,” wrongly identifying individuals as crime suspects when they passed through an area with a facial recognition camera. The trials showed that 96 percent of the scans police used to track watch-list suspects were inaccurate, a staggering failure rate.