As police increasingly use facial recognition technology, calls for legislation are growing

MONTREAL — Some police forces in Canada are using facial recognition technology to help solve crimes, while other police forces say human rights and privacy concerns are keeping them from using advanced digital tools.

It’s the uneven application of the technology – and lax rules for its use – that have legal and artificial intelligence experts calling on the federal government to set national standards.

“Until we better understand the risks associated with the use of this technology, there should be a moratorium or series of prohibitions on how and where it can be used,” says Kristen Thomasen, a law professor at the University of British Columbia.

Moreover, the patchwork of regulations surrounding emerging biometric technologies has led to situations where the privacy rights of some citizens are better protected than others.

“I think the fact that different police forces are taking different steps raises concerns (about) inequality and how people are treated across the country, but (it) also underscores the continued importance of some type of action being taken at the federal level,” she said.

Facial recognition systems are a form of biometric technology that uses artificial intelligence to identify people by comparing images or videos of their faces – often captured by security cameras – with images of them that exist in databases. This technology is a controversial tool in the hands of the police.

In 2021, the Office of the Privacy Commissioner of Canada found that the RCMP violated privacy laws by using the technology without the public’s knowledge. That same year, Toronto police admitted that some of its officers had used facial recognition software without informing their chief. In both cases, the technology was provided by the American company Clearview AI, whose database consisted of billions of images downloaded from the Internet without the consent of the people whose photos were used.

Last month, York and Peel police in Ontario said they had begun deploying facial recognition technology provided by multinational French company Idemia. In an interview, York Police Officer Kevin Nebrija said the tools “help speed up investigations and identify suspects faster,” adding that in terms of privacy, “nothing has changed because security cameras are everywhere.”

But in neighbouring Quebec, Montreal police chief Fady Dagher says police will not adopt such biometric identification tools without a debate on issues ranging from human rights to privacy.

“It will take a lot of discussion before we even start thinking about implementing it,” Dagher said in a recent interview.

Nebrija stressed that the department consulted the Ontario privacy commissioner to determine best practices, adding that the images police obtain will be acquired “in accordance with law” — either with the co-operation of the owners of the security cameras or through a court order for the footage.

And while York police say officers will seek legal approval, Kate Robertson, a senior researcher at the University of Toronto’s Citizen Lab, says Canadian police have tended to do exactly the opposite.

Since the revelations that Toronto police used Clearview AI in 2019-20, Robertson said, she is “still not aware of any police service in Canada that has obtained prior judicial approval to use facial recognition technology in their investigations.”

According to Robertson, obtaining court approval, usually in the form of a warrant, is the “gold standard for privacy protection in criminal investigations.” This ensures that facial recognition, when used, is properly balanced with the rights to freedom of expression, freedom of assembly, and other Charter rights.

While the federal government does not have jurisdiction over provincial and municipal police, it can amend the Criminal Code to include legal requirements for facial recognition software in the same way it has updated the law to include voice-recording technologies that can be used for surveillance.

In 2022, federal, provincial and territorial heads of Canada’s privacy commissions called on lawmakers to establish a legal framework for the appropriate use of facial recognition technology, including empowering independent oversight bodies, prohibiting mass surveillance and limiting the length of time images are stored in databases.

Meanwhile, the federal Department of Economic Development said Canadian law “has the potential” to regulate corporations’ collection of personal information under the Personal Information Protection and Electronic Documents Act, or PIPEDA.

“For example, if a police force, including the RCMP, were to engage a private commercial company to perform activities that utilize personal information, then those activities could potentially be subject to PIPEDA, including services related to facial recognition technologies,” the department said.

Quebec Provincial Police also has a contract with Idemia, but would not say exactly how it uses the company’s technology.

In an emailed statement, the police said its “automated facial matching system is not used to verify the identity of individuals.” The tool is used in criminal investigations and is limited to data sheets of individuals fingerprinted under the Offenders Identification Act.

Artificial intelligence governance expert Ana Brandusescu says Ottawa and the country’s police forces are failing to respond to calls for better governance, transparency and accountability when awarding contracts for facial recognition technology.

“Law enforcement agencies are not listening to scientists, civil society experts, people with life experience and people who have directly suffered harm,” she said.

This report by The Canadian Press was first published June 30, 2024.

Joe Bongiorno, The Canadian Press