
AI Police: Calls for regulation of facial recognition technology are growing

Some police forces in Canada are using facial recognition technology to help solve crimes, while other police forces say human rights and privacy concerns prevent them from using these powerful digital tools.

It’s the uneven application of the technology — and loose rules governing its use — that have legal and artificial intelligence experts calling on the federal government to establish national standards.

“Until the risks of using this technology are better managed, there should be a moratorium or a series of prohibitions on how and where it can be used,” says Kristen Thomasen, a law professor at the University of British Columbia.

Moreover, the patchwork of regulations surrounding emerging biometric technologies has led to situations where the privacy rights of some citizens are better protected than others.

“I think the fact that different police forces are taking different steps raises concerns (about) inequality and the treatment of people across the country, but (it) also underscores the continuing importance of taking some type of action at the federal level,” she said.

Facial recognition systems are a form of biometric technology that uses artificial intelligence to identify people by comparing images or video of their faces — often captured by security cameras — with existing images in databases. The technology has been a controversial tool in the hands of police.

In 2021, the Office of the Privacy Commissioner of Canada found that the RCMP had breached privacy laws when it used the technology without the public’s knowledge. That same year, Toronto police admitted that some of their officers had used facial recognition software without informing their chief. In both cases, the technology was provided by a US company called Clearview AI, whose database consisted of billions of images scraped from the internet without the consent of the people whose images were used.

Last month, York and Peel police in Ontario said they had begun deploying facial recognition technology from French multinational Idemia. In an interview, York police officer Kevin Nebrija said the tools “help speed up investigations and identify suspects faster,” adding that in terms of privacy, “nothing has changed because security cameras are everywhere.”

And in neighboring Quebec, Montreal police chief Fady Dagher says police forces will not introduce such biometric identification tools without a debate on issues such as human rights and privacy.

“It’s going to take a lot of discussion before we start thinking about implementing it,” Dagher said in a recent interview.

Nebrija emphasized that the force has consulted with the Information and Privacy Commissioner of Ontario to determine best practices, adding that the images police obtain will be “obtained lawfully” — either with the cooperation of the owners of the security cameras or after obtaining a court order for the footage.

And while York police say officers are urged to seek judicial authorization, Kate Robertson, a senior researcher at the University of Toronto’s Citizen Lab, says Canadian police have long done the opposite.

Since the revelation that Toronto police used Clearview AI between 2019 and 2020, Robertson said she is “still unaware of any police service in Canada that has obtained prior approval from a judge to use facial recognition technology in its investigations.”

Obtaining court approval, usually in the form of an order, is the “gold standard for privacy protection in criminal investigations,” according to Robertson. This ensures that facial recognition, if used, is appropriately balanced with the rights to freedom of expression, freedom of assembly and other Charter rights.

While the federal government does not have jurisdiction over provincial and municipal police, it can amend the Criminal Code to include legal requirements for facial recognition software in the same way it has updated the law to include voice-recording technologies that can be used for surveillance.

In 2022, Canada’s federal, provincial and territorial heads of privacy commissions called on lawmakers to establish a legal framework for the appropriate use of facial recognition technology, including empowering independent oversight bodies, banning mass surveillance and limiting how long images can be stored in databases.

Meanwhile, the federal Department of Economic Development said Canadian law “has the potential” to regulate companies’ collection of personal information under the Personal Information Protection and Electronic Documents Act, or PIPEDA.

“For example, if a police force, including the RCMP, were to outsource activities involving the use of personal information to a private commercial company, then those activities could potentially be subject to PIPEDA, including services related to facial recognition technologies,” the department said.

Quebec Provincial Police also has a contract with Idemia, but has not disclosed exactly how it uses the company’s technology.

In an emailed statement, the force said its “automatic facial matching system is not used to check people’s identities. This tool is used in criminal investigations and is limited to data sheets of persons whose fingerprints have been taken under the Identification of Criminals Act.”

Artificial intelligence management expert Ana Brandusescu says Ottawa and the nation’s police forces have failed to heed calls for better governance, transparency and accountability in the procurement of facial recognition technology.

“Law enforcement is not listening to scientists, civil society experts, people with lived experience and people who have been directly harmed,” she said.


This report by The Canadian Press was first published June 30, 2024.