The Edmonton Police Service announced Tuesday it will become the first police force in the world to trial facial-recognition-enabled bodycams using an artificial intelligence (AI) product from Axon Enterprise.
“I want to make it clear that this facial-recognition technology will not replace the human component of investigative work,” acting Supt. Kurt Martin with EPS’ information and analytics division said during a news conference.
“In fact, the resemblances that are identified by this software will be human-verified by officers trained in facial recognition.”
Martin said the police force’s goal is to test another tool in its operations toolbox that can help further ensure public and officer safety while also respecting privacy considerations.
Axon Enterprise, an Arizona-based company, develops weapons and technology products for military, law enforcement and civilians in jurisdictions where legal.


Axon’s rep basically says that their mass surveillance cameras don’t see colour, just people. Then she follows with the claim that the main factor is skin tone (??). That’s a problem that was noted as far back as…2019. What development in the technology is she talking about?
According to Ann-Li Cooke, Axon Enterprise’s director of responsible AI:
Also note that the facial-recognition technology seems to have a fatal flaw when it comes to women with darker skin.
You know what was a problem with the technology back in 2019? These AI models are coded primarily by white males, and their idea of “normal” hard-codes bias into them. These “AI” products essentially show their coders’ bias by discriminating against whatever falls outside that “normal.”
For example, from “How tech’s white male workforce feeds bias into AI”, by Aimee Picchi:
https://www.cbsnews.com/news/ai-bias-problem-techs-white-male-workforce/
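To make that concrete, here is a minimal sketch of how skewed training data produces skewed error rates. This is not Axon’s system or a real face matcher; the two groups, their feature patterns, and the 95/5 split are invented purely for illustration. A classifier trained almost entirely on one group’s data learns that group’s pattern and does little better than a coin flip on the other.

```python
# Toy illustration (hypothetical data, not any real facial-recognition product):
# a classifier trained on data that over-represents one group fits that group's
# pattern and performs far worse on the under-represented group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, rule):
    """Generate 2-D features and labels according to a group-specific rule."""
    x = rng.normal(size=(n, 2))
    y = (rule(x) > 0).astype(int)
    return x, y

# The two invented groups follow different underlying patterns.
rule_a = lambda x: x[:, 0] + x[:, 1]
rule_b = lambda x: x[:, 0] - x[:, 1]

# Training data: 95% group A, 5% group B -- the skew is the whole point.
xa_tr, ya_tr = make_group(9500, rule_a)
xb_tr, yb_tr = make_group(500, rule_b)
x_train = np.vstack([xa_tr, xb_tr])
y_train = np.concatenate([ya_tr, yb_tr])

model = LogisticRegression(max_iter=1000).fit(x_train, y_train)

# Evaluate on equal-sized test sets from each group.
xa_te, ya_te = make_group(2000, rule_a)
xb_te, yb_te = make_group(2000, rule_b)
print("accuracy on group A:", model.score(xa_te, ya_te))  # high, roughly 0.97+
print("accuracy on group B:", model.score(xb_te, yb_te))  # near chance, ~0.5
```

Rebalance the training split and the same model spreads its error across both groups instead of concentrating it on the under-represented one; the model reflects whatever its builders treated as “normal” data.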