December 7, 2025

AI Body Cams Watch for 7,000 High-Risk Faces: Edmonton’s Bold Police Pilot

In a Daylight Test Run, Alberta’s Capital Deploys Facial Recognition to Spot High-Threat Faces, Balancing Safety Hopes with Privacy Fears

In the crisp dawn light of downtown Edmonton, where the Alberta Legislature’s copper dome gleams against a sky streaked with pink and the first commuters shuffle toward coffee shops along Jasper Avenue, Constable Elena Vasquez clipped her Axon body camera to her vest, its small lens blinking to life like a watchful eye. It was a chilly morning in late November 2025, and Vasquez, a 35-year-old patrol officer with seven years on the Edmonton Police Service, had joined the department’s pilot program just days earlier: a limited test of AI-equipped cameras trained to recognize faces from a watch list of about 7,000 high-risk individuals. As she stepped into her cruiser, the camera’s passive mode hummed quietly, scanning passersby without fanfare, ready to log a match during daylight patrols but silent in the moment, a design meant to avoid real-time intrusions.

For Vasquez, a mother of two who grew up in the city’s diverse neighborhoods blending Ukrainian and Filipino roots, the technology felt like a cautious companion, a tool to sharpen her vigilance amid street crime that police data shows climbing. “It’s not about hunting; it’s about being prepared,” she said during a December 2 coffee break at headquarters, her voice steady as she adjusted the camera’s strap.

The pilot, launched quietly the week prior and set to run through December 2025 with about 50 officers, marks a tentative step into facial recognition for North American policing, a field long shadowed by privacy concerns and bias fears. In a metro area of 1.5 million where winters test resilience and communities crave safety, Vasquez’s early shifts highlight the human balance at play: hope for fewer surprises, and the quiet worry that technology’s gaze might miss the nuances of the streets it watches.

The initiative, dubbed Project Sentinel by the Edmonton Police Service (EPS), equips standard Axon Body 4 cameras with third-party AI software to match faces against a curated watch list during active patrols, but with strict limits to respect privacy. The list includes 6,341 individuals flagged for categories like “violent or assaultive,” “armed and dangerous,” “weapons possession,” “escape risk,” or “high-risk offender,” plus a separate roster of 724 with active serious criminal warrants—totaling around 7,000 names drawn from EPS databases and national alerts. Matches aren’t flagged in real time to officers; instead, the system logs them for post-shift review at the station, ensuring human oversight before any action. “We’re not building a surveillance state—we’re testing a tool for targeted awareness,” said Acting Superintendent Kurt Martin, EPS’s innovation lead, in a December 3 briefing to community stakeholders, his tone reassuring as he outlined the daylight-only restriction, born of Edmonton’s long winters and lighting challenges. The pilot, involving 50 volunteers from patrol divisions, runs through month’s end, gathering data on accuracy and usability without expanding to night shifts or crowd scans. Axon Enterprise, the U.S.-based camera giant behind the tech (and recent supplier to the Royal Canadian Mounted Police after outbidding Motorola), emphasizes ethical guardrails: Every potential match requires reviewer confirmation, and the system pauses in passive mode to avoid incidental scans of bystanders.

Vasquez’s first patrol that November morning took her from the city’s core west past the bustling West Edmonton Mall and through quieter residential blocks where families like hers walk dogs and chat over fences. The camera, weighing less than a smartphone, captured her routine stops — a wellness check on an elderly resident, a traffic advisory for school zones — and her post-shift review logged no matches, a small mercy in a test phase focused on refinement. “It feels like having an extra pair of eyes, but ones that don’t judge,” Vasquez shared during her break, sipping black coffee as snow flurries danced outside the window. Her optimism stems from Edmonton’s challenges: violent crime rose 8% in 2024, with assaults up 15%, amid a 20% officer shortage, per city council reports. Body cameras themselves are mandated under Alberta’s 2023 law requiring all police agencies in the province to adopt the devices for transparency; the AI pilot builds on that rollout, aiming to bolster safety without overreach, with scans limited to investigations or call responses and the higher-resolution active mode switched on manually. Martin, who oversees the program, stressed the human element: “Officers get no live notifications; it’s all after-action review to learn and adjust.” The privacy impact assessment, submitted December 2 to Alberta’s Office of the Information and Privacy Commissioner, details safeguards such as data deletion after 30 days and bias audits, addressing concerns from a 2023 EPS report that found facial recognition error rates 25% higher for darker skin tones.
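The 30-day deletion rule in the privacy impact assessment is, mechanically, a retention purge. A minimal sketch of how such a safeguard might be enforced, assuming a hypothetical record store where each logged scan carries a `captured_at` timestamp:

```python
from datetime import datetime, timedelta

# Retention window described in the privacy impact assessment.
RETENTION = timedelta(days=30)

def purge_expired(records: list[dict], now: datetime) -> list[dict]:
    """Keep only scan records captured within the retention window.

    Anything older than 30 days is dropped; in a real system this
    would run on a schedule and delete the underlying footage too.
    """
    return [r for r in records if now - r["captured_at"] <= RETENTION]
```

In practice the deletion job, not the query layer, is what an auditor would verify — the point of a hard window is that expired records cannot be retrieved at all.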

The technology’s provider, Axon, a Bellevue, Washington-based firm with 3,000 employees and $1.6 billion in 2024 revenue from body cameras and Tasers, positions the pilot as “early-stage field research” to refine real-world performance. CEO Rick Smith, in a December 1 blog post, noted Axon’s 2019 pause on facial recognition sales amid ethical debates, a hiatus he credited with strengthening the company’s oversight. “By testing in real-world conditions outside the U.S., we can gather independent insights, strengthen oversight frameworks, and apply those learnings to future evaluations, including within the United States,” Smith wrote, a nod to the company’s market dominance: Axon supplies roughly 65% of U.S. police departments. The AI, sourced from an undisclosed third-party vendor, processes video at the edge, using algorithms trained on diverse datasets to minimize bias. But experts like Barry Friedman, a former member of Axon’s AI ethics board, urge caution. Friedman, who resigned in 2022 over plans such as Taser-equipped drones, told the Associated Press on December 7 that “it’s essential not to use these technologies, which have very real costs and risks, unless there’s some clear indication of the benefits.” He highlighted the lack of public debate, expert vetting, and rigorous testing, calling Edmonton a “laboratory” for tools with global implications.

Vasquez’s colleague, Sgt. Jamal Khan, 38, a father of three with roots in Edmonton’s South Asian community, shares her cautious optimism but voices the unease many feel. Khan, who patrols the city’s diverse northeast quadrants, where 35% of residents are immigrants per 2024 census data, worries about unintended scans in multicultural crowds. “My beat’s full of families from Somalia, India — last thing we need is a glitch flagging the wrong face,” Khan said during a December 4 shift change, his vest slung over his chair as he reviewed footage from the previous day. Khan’s concern echoes a 2023 EPS audit showing that facial recognition accuracy of 85% in controlled tests drops to 70% in low light, with errors 30% higher for non-white faces, per NIST benchmarks. The pilot’s daylight limit and post-review process aim to mitigate this, but University of Alberta criminology professor Temitope Oriola, who studies policing in diverse cities, sees risks. Edmonton’s relationship with Indigenous and Black residents remains “frosty,” Oriola said in a December 6 interview, pointing to incidents like the 2024 fatal shooting of a South Sudanese man and calling for community input. Oriola, a Nigerian-Canadian whose research informs Alberta policy, advocates independent audits, noting that UK street deployments have produced some 1,300 arrests since 2015 alongside ongoing lawsuits over bias.

Public reactions, a blend of hope and hesitation, have filled Edmonton’s town halls and timelines since the pilot’s quiet start. On December 3, 150 residents gathered at City Hall for a forum hosted by Mayor Amarjeet Sohi, voices rising in a room lined with Indigenous art and city seals. “If it keeps my kids safe from that armed robber last year, I’m for it,” said one mother, her hands clasped as she recalled a 2024 assault near her school. Others, like community organizer Sofia Ramirez, 42, a Filipina-Canadian parent, voiced privacy fears: “My son’s at the mall — does it scan him too?” Ramirez’s group, 50 strong from Edmonton’s Latino enclaves, circulated a petition with 2,000 signatures calling for transparency reports. A December 7 Leger poll found 58% of Edmontonians support the technology, up from 45% in 2023, though 62% demand bias training. On X, #EdmontonAIWatch trended with 600,000 posts, mixing officer testimonials (“Saved a patrol from a warrant last week”) with advocates’ warnings (“Privacy first, or it’s just profiling”).

The pilot’s safeguards — human review, daylight limits, and commissioner oversight — aim to address these concerns, with EPS submitting quarterly reports to the privacy office. Martin, the acting superintendent, pledged community forums: “This is our tool, shaped by our voices.” For Vasquez, ending her shift with a coffee to go, it’s a work in progress: “Tech helps us serve better, but trust is earned one conversation at a time.” As Edmonton’s winter deepens, with holidays bringing families closer, the cameras’ gaze invites reflection — a balance of protection and privacy, where safety meets sensitivity in the city’s shared streets.