Edmonton police have quietly launched a controversial AI experiment: body cameras that can scan the faces of thousands of people deemed “high risk” as officers patrol the streets of Alberta’s capital city.
About 50 officers are now equipped with body cameras that use artificial intelligence to detect the faces of roughly 7,000 individuals flagged on watch lists — including people considered armed and dangerous and those with outstanding warrants. The pilot program, which will run through December 2024, marks one of the first real-world deployments of real-time facial recognition in police body cameras in North America, raising significant questions about privacy, accuracy and public oversight.
“Edmonton is a laboratory for this tool,” University of Alberta criminology professor Temitope Oriola noted. “It may well turn out to be an improvement, but we do not know that for sure.”
Watching the Watchlist
The technology, developed by Axon (formerly known as Taser), scans for 6,341 individuals flagged for serious risks such as violent behavior or escape risk, plus another 724 people with at least one serious criminal warrant, according to Edmonton police officials at a December press conference.
“We really want to make sure that it’s targeted so that these are folks with serious offenses,” said Ann-Li Cooke, Axon’s director of responsible AI, who emphasized the limited scope of the system.
For now, the system doesn’t alert officers in real time; the facial recognition outputs are analyzed later at the police station. Future versions, however, could notify officers when a flagged individual is nearby during calls or investigations.
Notably, the pilot is limited to daylight hours. Why? “Obviously it gets dark pretty early here,” Edmonton police explained. “Lighting conditions, our cold temperatures during the wintertime, all those things will factor into what we’re looking at in terms of a successful proof of concept.”
Testing Ground Beyond U.S. Borders
Axon founder and CEO Rick Smith described the Edmonton pilot as “early-stage field research” designed to evaluate the technology under real-world conditions before potential wider deployment.
“By testing in real-world conditions outside the U.S., we can gather independent insights, strengthen oversight frameworks, and apply those learnings to future evaluations, including within the United States,” Smith wrote in a blog post.
The timing is notable. The government of Alberta mandated body cameras for all police agencies in the province in 2023 as a transparency and accountability measure. Edmonton is now the first city in Alberta to add AI facial recognition to those cameras.
Meanwhile, Axon competitor Motorola Solutions has taken a different approach. While technically capable of adding facial recognition to police body cameras, the company has “intentionally abstained from deploying this feature for proactive identification” based on ethical principles.
Accuracy Concerns and Ethical Questions
The pilot has sparked significant debate. Axon acknowledged to the Associated Press that all facial recognition systems are affected by “factors like distance, lighting and angle, which can disproportionately impact accuracy for darker-skinned individuals.” The company emphasized that every match requires human review.
Barry Friedman, former chair of Axon’s AI ethics board and now a law professor at New York University, expressed concern about the lack of public deliberation: “It’s essential not to use these technologies, which have very real costs and risks, unless there’s some clear indication of the benefits.”
He added: “It’s not a decision to be made simply by police agencies and certainly not by vendors. A pilot is a great idea. But there’s supposed to be transparency, accountability.… None of that’s here. They’re just going ahead.”
A privacy impact assessment for the facial recognition pilot was submitted to Alberta’s information and privacy commissioner, Diane McLeod, on December 2. Her office is now reviewing it, a requirement for projects collecting “high sensitivity” personal data.
The deployment raises particular questions in Edmonton, where relations between police and Indigenous and Black communities have historically been strained. Critics question whether the technology will improve safety or further complicate those relationships.
As cities around North America grapple with the implications of AI in policing, Edmonton’s experiment may well set precedents for how — or whether — such technology should be deployed on streets across the continent.