British Columbia police have confirmed the use of a controversial facial recognition software.
According to a new report from CBC, British Columbia’s privacy commissioner confirmed that five police officers and one civilian have used the Clearview AI facial recognition software despite privacy violation concerns.
“It wasn’t surprising that officers individually might want to take advantage of technologies they might think to be helpful,” said B.C. Information and Privacy Commissioner Michael McEvoy. “But what’s important is that they need to think about issues such as whether those technologies violate our privacy laws.”
McEvoy wouldn’t confirm which B.C. police force used the software, and said that the civilian involved was a single employee at a private company.
Last week, the privacy commissioner released a report that found Clearview AI’s technology posed a significant risk to individuals by “allowing law enforcement and companies to match photos against its database of more than three billion images, including Canadians and children.”
This investigation found that Clearview AI collected images in Canada and marketed its services to Canadian police forces.
The report notes that a number of Canadian law enforcement agencies, including the RCMP and the Toronto and Calgary police services, had been using the technology to help identify perpetrators and victims of crimes.
Banning the use of facial recognition is one targeted measure that can rein in emerging concerns about aggressive policing. Many privacy advocates warn that AI-powered facial recognition systems not only disproportionately target communities of color, but have also been demonstrated to have technical shortcomings in discerning non-white faces.