As police increasingly use facial recognition technology, calls for regulations grow

MONTREAL — Some police agencies in Canada are using facial recognition technology to help solve crimes, while other police agencies say human rights and privacy concerns are keeping them from using powerful digital tools.

It’s the uneven application of the technology – and lax rules for its use – that have legal and artificial intelligence experts calling on the federal government to set national standards.

“Until we better understand the risks of using this technology, there should be a moratorium or a series of prohibitions on how and where it can be used,” says Kristen Thomasen, a law professor at the University of British Columbia.

Moreover, inconsistent regulations regarding new biometric technologies have led to situations in which the privacy rights of some citizens are better protected than others.

“I think the fact that different police forces are taking different steps raises concerns (about) inequality and the treatment of people across the country, but (it) also underscores the continuing importance of taking some type of action at the federal level,” she said.

Facial recognition systems are a form of biometric technology that uses artificial intelligence to identify people by comparing images or videos of their faces — often captured by security cameras — with their existing likenesses in databases. The technology is a controversial tool in the hands of police.

In 2021, the Office of the Privacy Commissioner of Canada found that the RCMP had breached privacy laws when it used the technology without the public’s knowledge. That same year, Toronto police admitted that some of their officers had used facial recognition software without informing their chief. In both cases, the technology was provided by a US company called Clearview AI, whose database consisted of billions of images scraped from the internet without the consent of the people whose images were used.

Last month, York and Peel police in Ontario said they had begun deploying facial recognition technology provided by multinational French company Idemia. In an interview, York Police Officer Kevin Nebrija said the tools “help speed up investigations and identify suspects faster,” adding that in terms of privacy, “nothing has changed because security cameras are everywhere.”

And in neighboring Quebec, Montreal police chief Fady Dagher says police forces will not introduce such biometric identification tools without a debate on issues such as human rights and privacy.

“It’s going to take a lot of discussion before we start thinking about implementing it,” Dagher said in a recent interview.

Nebrija stressed that the department has consulted with the Ontario Privacy Commissioner to determine best practices, adding that the images police will obtain will be “obtained in accordance with law” — either with the cooperation of the owners of the security cameras or by obtaining a court order for the images.

And while York police say officers will seek legal approval, Kate Robertson, senior researcher at the University of Toronto’s Citizen Lab, says Canadian police tend to do exactly the opposite.

Robertson said that since revelations about Toronto police’s use of Clearview AI in 2019-2020, she is “still not aware of any police in Canada that has obtained prior approval from a judge to use facial recognition technology in their investigations.”

Obtaining court approval, usually in the form of an order, is the “gold standard for privacy protection in criminal investigations,” according to Robertson. This ensures that facial recognition, when used, is appropriately balanced with the rights to freedom of expression, freedom of assembly and other rights enshrined in the Charter.

While the federal government has no jurisdiction over provincial and municipal police forces, it could amend the Criminal Code to include legal requirements for facial recognition software, in the same way it updated the law to cover voice recording technologies that could be used for surveillance.

In 2022, Canada’s federal, provincial and territorial heads of privacy commissions called on lawmakers to establish a legal framework for the appropriate use of facial recognition technology, including empowering independent oversight bodies, banning mass surveillance and limiting how long images can be stored in databases.

Meanwhile, the federal Department of Economic Development said Canadian law “may potentially” regulate companies’ collection of personal data under the Personal Information Protection and Electronic Documents Act, or PIPEDA.

“If, for example, the police, including the RCMP, were to outsource activities using personal data to a private company engaged in commercial activities, then those activities could potentially be subject to PIPEDA regulation, including services related to facial recognition technologies,” the department said.

Quebec Provincial Police also has a contract with Idemia, but has not disclosed exactly how it uses the company’s technology.

In an emailed statement, police said its “automatic facial matching system is not used to check people’s identities. This tool is used in criminal investigations and is limited to data sheets of persons whose fingerprints have been taken under the Criminal Identification Act.”

AI management expert Ana Brandusescu says that Ottawa and the country’s police forces have failed to heed calls for better governance, transparency and accountability in the procurement of facial recognition technology.

“Law enforcement is not listening to scientists, civil society experts, people with lived experience and people who have been directly harmed,” she said.

This report by The Canadian Press was first published June 30, 2024.

Joe Bongiorno, The Canadian Press