LONDON — London’s police department said on Friday that it would begin using facial recognition technology to identify people on the street in real time with video cameras, adopting a level of surveillance that is rare outside China.
The decision is a major development in the use of a technology that has set off a worldwide debate about the balance between security and privacy. Police departments contend that the software gives them a technological edge to catch criminals who might otherwise avoid detection. Critics say the technology is an invasion of privacy and is being rolled out without adequate public discussion.
Britain has been at the forefront of the debate. In a country where CCTV cameras line the streets, police surveillance has traditionally been more accepted than in other Western countries.
The technology London plans to deploy goes beyond many of the facial recognition systems used elsewhere, which match a photo against a database to identify a person. The new systems, created by the company NEC, attempt to identify people on a police watch list in real time with security cameras, giving officers a chance to stop them on the spot.
Under pressure to address rising crime, the Metropolitan Police said in a statement that the technology would help quickly identify and apprehend suspects and help “tackle serious crime, including serious violence, gun and knife crime, child sexual exploitation and help protect the vulnerable.”
“Every day, our police officers are briefed about suspects they should look out for,” Nick Ephgrave, assistant commissioner of the police department, said in the statement. Live facial recognition, he said, “improves the effectiveness of this tactic.”
“As a modern police force, I believe that we have a duty to use new technologies to keep people safe in London,” he added.
Already widespread in China, facial recognition is gaining traction in Western countries. An investigation by The New York Times this month found that more than 600 law enforcement agencies are using a facial recognition system by the company Clearview AI. According to researchers at Georgetown University, cities including New York, Chicago, Detroit and Washington have at least piloted the use of the real-time systems.
Use of facial recognition technology has generated a backlash. San Francisco, Oakland and Berkeley in California, along with Somerville and Brookline in Massachusetts, have banned its use.
Privacy groups immediately criticized London’s decision and vowed to take legal action to try to stop its deployment.
“This decision represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the U.K.,” said Silkie Carlo, director of Big Brother Watch, a London-based group that has been fighting the use of facial recognition. “This is a breathtaking assault on our rights and we will challenge it.”
Last year, a British judge ruled that police departments could use the technology without violating privacy or human rights, a decision that is under appeal. The government’s top privacy regulator has raised concerns about the use of the technology, as has an independent review of a trial by the Metropolitan Police.
The Metropolitan Police said it would be transparent about deploying the technology. Officers will post signs and hand out leaflets when the cameras are in use.
Researchers have found problems with many facial recognition systems, including trouble accurately identifying people who are not white men. Civil liberties groups point to flaws in the technology as a reason it should not be deployed, arguing it will lead to constant surveillance and hinder free movement.
In Britain, an independent review last year found many problems with a police trial of facial recognition, including poor accuracy. Of 42 identifications made by the system in one trial, only eight were correct.
“It was incredibly inaccurate,” said Daragh Murray, a senior lecturer at the University of Essex who conducted the report. “Most times they didn’t actually find the people they were looking for. From just a technological perspective, you have to question the utility.”
Mr. Murray said that without clear laws about how the technology is used, police departments everywhere have wide latitude to put the camera systems in place. Particularly concerning, he said, is the lack of transparency about how police decide when somebody is placed on a watch list.
“Too much leeway is given to the police,” Mr. Murray said. “What is needed is proper safeguards around its use.”
Britain’s Information Commissioner’s Office, the country’s top privacy regulator, said it would monitor how the system is deployed. It said the police gave assurances that the department would take steps to reduce privacy and data-protection risks.
“This is an important new technology with potentially significant privacy implications for U.K. citizens,” the privacy regulator said in a statement.