AI-powered police body cameras, once taboo, get tested on Canadian city’s ‘watch list’ of faces

Police body cameras equipped with artificial intelligence have been trained to detect the faces of about 7,000 people on a “high risk” watch list in the Canadian city of Edmonton, a live test of whether facial recognition technology shunned as too intrusive could have a place in policing throughout North America.

But six years after leading body camera maker Axon Enterprise, Inc. said police use of facial recognition technology posed serious ethical concerns, the pilot project — switched on last week — is raising alarms far beyond Edmonton, the continent’s northernmost city of more than 1 million people.

A former chair of Axon’s AI ethics board, which led the company to temporarily abandon facial recognition in 2019, told The Associated Press he’s concerned that the Arizona-based company is moving forward without enough public debate, testing and expert vetting about the societal risks and privacy implications.

“It’s essential not to use these technologies, which have very real costs and risks, unless there’s some clear indication of the benefits,” said the former board chair, Barry Friedman, now a law professor at New York University.

Axon founder and CEO Rick Smith contends that the Edmonton pilot is not a product launch but “early-stage field research” that will assess how the technology performs and reveal the safeguards needed to use it responsibly.

“By testing in real-world conditions outside the U.S., we can gather independent insights, strengthen oversight frameworks, and apply those learnings to future evaluations, including within the United States,” Smith wrote in a blog post.

The pilot is meant to help make Edmonton patrol officers safer by enabling their body-worn cameras to detect anyone who authorities classified as having a “flag or caution” for categories such as “violent or assaultive; armed and dangerous; weapons; escape risk; and high-risk offender,” said Kurt Martin, acting superintendent of the Edmonton Police Service. So far, that watch list has 6,341 people on it, Martin said at a Dec. 2 press conference. A separate watch list adds 724 people who have at least one serious criminal warrant, he said.

“We really want to make sure that it’s targeted so that these are folks with serious offenses,” said Ann-Li Cooke, Axon’s director of responsible AI.

If the pilot expands, it could have a major effect on policing around the world. Axon, a publicly traded firm best known for developing the Taser, is the dominant U.S. supplier of body cameras and has increasingly pitched them to police agencies in Canada and elsewhere. Axon last year beat its closest competitor, Chicago-based Motorola Solutions, in a bid to sell body cameras to the Royal Canadian Mounted Police.

Motorola said in a statement that it also has the ability to integrate facial recognition technology into police body cameras but, based on its ethical principles, has “intentionally abstained from deploying this feature for proactive identification.” It didn’t rule out using it in the future.

The government of Alberta in 2023 mandated body cameras for all police agencies in the province, including its capital city Edmonton, describing it as a transparency measure to document police interactions, collect better evidence and reduce timelines for resolving investigations and complaints.

While many communities in the U.S. have also welcomed body cameras as an accountability tool, the prospect of real-time facial recognition identifying people in public places has been unpopular across the political spectrum. Backlash from civil liberties advocates and a broader conversation about racial injustice helped push Axon and Big Tech companies to pause facial recognition software sales to police.

Among the biggest concerns were studies showing that the technology was flawed, demonstrating biased results by race, gender and age. It also didn’t match faces as accurately on real-time video feeds as it did on faces posing for identification cards or police mug shots.

Several U.S. states and dozens of cities have sought to curtail police use of facial recognition, though President Donald Trump’s administration is now trying to block or discourage states from regulating AI.

The European Union banned real-time public face-scanning police technology across the 27-nation bloc, except when used for serious crimes like kidnapping or terrorism.

But in the United Kingdom, no longer part of the EU, authorities started testing the technology on London streets a decade ago and have used it to make 1,300 arrests in the past two years. The government is considering expanding its use across the country.

Many details about Edmonton’s pilot haven’t been publicly disclosed. Axon doesn’t make its own AI model for recognizing faces but declined to say which third-party vendor it uses.

Edmonton police say the pilot will continue through the end of December and only during daylight hours.

“Obviously it gets dark pretty early here,” Martin said. “Lighting conditions, our cold temperatures during the wintertime, all those things will factor into what we’re looking at in terms of a successful proof of concept.”

Martin said about 50 officers piloting the technology won’t know if the facial recognition software made a match. The outputs will be analyzed later at the station. In the future, however, it could help police detect if there’s a potentially dangerous person nearby so they can call in for assistance, Martin said.

That’s only supposed to happen if officers have started an investigation or are responding to a call, not simply while strolling through a crowd. Martin said officers responding to a call can switch their cameras from a passive to an active recording mode with higher-resolution imaging.

“We really want to respect individuals’ rights and their privacy interests,” Martin said.

The office of Alberta’s information and privacy commissioner, Diane McLeod, said it received a privacy impact assessment from Edmonton police on Dec. 2, the same day Axon and police officials announced the program. The office said Friday it’s now working to review the assessment, a requirement for projects that collect “high sensitivity” personal data.

University of Alberta criminology professor Temitope Oriola said he’s not surprised that the city is experimenting with live facial recognition, given that the technology is already ubiquitous in airport security and other environments.

“Edmonton is a laboratory for this tool,” Oriola said. “It may well turn out to be an improvement, but we do not know that for sure.”

Oriola said the police service has had a sometimes “frosty” relationship with its Indigenous and Black residents, particularly after the fatal police shooting of a member of the South Sudanese community last year, and it remains to be seen whether facial recognition technology makes policing safer or improves interactions with the public.

Axon has faced blowback for its technology deployments in the past, as in 2022, when Friedman and seven other members of Axon’s AI ethics board resigned in protest over concerns about a Taser-equipped drone.

In the years since Axon opted against facial recognition, Smith, the CEO, says the company has “continued controlled, lab-based research” of a technology that has “become significantly more accurate” and is now ready for trial in the real world.

But Axon acknowledged in a statement to the AP that all facial recognition systems are affected by “factors like distance, lighting and angle, which can disproportionately impact accuracy for darker-skinned individuals.”

Every match requires human review, Axon said, and part of its testing is also “learning what training and oversight human reviewers must have to mitigate known risks.”

Friedman said Axon should disclose those evaluations. He’d want to see more evidence that facial recognition has improved since his board concluded that it wasn’t reliable enough to ethically justify its use in police cameras.

Friedman said he’s also concerned about police agencies greenlighting the technology’s use without deliberation by local legislators and rigorous scientific testing.

“It’s not a decision to be made simply by police agencies and certainly not by vendors,” he said. “A pilot is a great idea. But there’s supposed to be transparency, accountability. … None of that’s here. They’re just going ahead. They found an agency willing to go ahead and they’re just going ahead.”

___

AP writer Kelvin Chan in London contributed to this report.

Copyright 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.
