
As police increasingly use facial recognition technology, calls for regulations grow

“I think the fact that we have different police forces taking different steps raises concerns about inequality and how people are treated across the country, but it also underscores the continued importance of taking some action at the federal level,” she said.

Facial recognition systems are a form of biometric technology that uses artificial intelligence to identify people by comparing images or videos of their faces – often captured by security cameras – with existing images in databases. The technology has been a controversial tool in the hands of police.

In 2021, the Office of the Privacy Commissioner of Canada found that the RCMP had breached privacy laws by using the technology without the public’s knowledge. That same year, Toronto police admitted that some of their officers had used facial recognition software without informing their police chief. In both cases, the technology was provided by the American company Clearview AI, whose database consisted of billions of images scraped from the internet without the consent of the people whose photos were used.

Last month, York and Peel police in Ontario said they had begun deploying facial recognition technology provided by the French multinational Idemia. In an interview, York Regional Police Const. Kevin Nebrija said the tools “help speed up investigations and identify suspects faster,” adding that in terms of privacy, “nothing has changed because security cameras are everywhere.”

But in neighboring Quebec, Montreal police chief Fady Dagher says police will not adopt such biometric identification tools without debate on issues ranging from human rights to privacy.

“It’s going to take a lot of discussion before we start thinking about implementing it,” Dagher said in a recent interview.

Nebrija stressed that the force has consulted Ontario’s privacy commissioner on best practices, adding that any images police use will be “obtained lawfully,” either with the co-operation of the owners of the security cameras or through court orders.

And while York police say officers will seek judicial authorization, Kate Robertson, a senior researcher at the University of Toronto’s Citizen Lab, says Canadian police have long done the opposite.

Since the revelation that Toronto police used Clearview AI between 2019 and 2020, Robertson said, she is “still unaware of any police service in Canada that has obtained prior approval from a judge to use facial recognition technology in its investigations.”

According to Robertson, obtaining court approval, usually in the form of a warrant, is the “gold standard for protecting privacy in criminal investigations.” It ensures that a facial recognition tool, if used, is properly weighed against the rights to freedom of expression, freedom of assembly and other Charter protections.

While the federal government has no jurisdiction over provincial and municipal police forces, it can amend the Criminal Code to include legal requirements for facial recognition software, in the same way it updated the law to include voice-recording technologies that could be used for surveillance.

In 2022, the heads of Canada’s federal, provincial and territorial privacy commissions called on lawmakers to establish a legal framework to enable the appropriate use of facial recognition technology, including empowering independent oversight bodies, banning mass surveillance and limiting how long images can be stored in databases.

Meanwhile, Innovation, Science and Economic Development Canada said federal law “may potentially” regulate companies’ collection of personal data under the Personal Information Protection and Electronic Documents Act, or PIPEDA.

“For example, if a police force, including the RCMP, were to engage a private commercial company to perform activities that utilize personal information, then those activities could potentially be subject to PIPEDA, including services related to facial recognition technologies,” the department said.

Quebec’s provincial police force has also contracted with Idemia, but has not disclosed exactly how it uses the company’s technology.

In an emailed statement, the force said its “automated facial matching system is not used to verify the identity of individuals.” The tool is used in criminal investigations and is limited to records of individuals fingerprinted under the Identification of Criminals Act.

Ana Brandusescu, an expert in artificial intelligence governance, says Ottawa and the country’s police forces have failed to heed calls for better governance, transparency and accountability in the procurement of facial recognition technology.

“Law enforcement agencies are not listening to scientists, civil society experts, people with lived experience and people who have directly suffered harm,” she said.

This report by The Canadian Press was first published June 30, 2024.

Joe Bongiorno, The Canadian Press