
As police increasingly use facial recognition technology, calls for regulations grow

“I think the fact that different police forces are taking different steps raises concerns (about) inequality and the treatment of people across the country, but (it) also underscores the continuing importance of taking some type of action at the federal level,” she said.

Facial recognition systems are a form of biometric technology that uses artificial intelligence to identify people by comparing images or video of their faces — often captured by security cameras — with existing images in databases. The technology has been a controversial tool in the hands of police.

In 2021, the Office of the Privacy Commissioner of Canada found that the RCMP had breached privacy laws by using the technology without the public’s knowledge. That same year, Toronto police admitted that some of their officers had used facial recognition software without informing their chief. In both cases, the technology was provided by the American company Clearview AI, whose database consisted of billions of images scraped from the internet without the consent of the people whose photos were used.

Last month, York and Peel police in Ontario said they had begun deploying facial recognition technology from French multinational Idemia. In an interview, York police officer Kevin Nebrija said the tools “help speed up investigations and identify suspects faster,” adding that in terms of privacy, “nothing has changed because security cameras are everywhere.”

And in neighbouring Quebec, Montreal police chief Fady Dagher says his force will not introduce such biometric identification tools without a debate on issues such as human rights and privacy.

“It’s going to take a lot of discussion before we start thinking about implementing it,” Dagher said in a recent interview.

Nebrija stressed that the department has consulted with the Ontario Privacy Commissioner to determine best practices, adding that the images police will obtain will be “obtained in accordance with law” — either with the cooperation of the owners of the security cameras or by obtaining a court order for the images.

And while York police say they will seek legal authorization where it is required, Kate Robertson, a senior researcher at the University of Toronto’s Citizen Lab, says Canadian police have long done the opposite.

Robertson said that since revelations about Toronto police’s use of Clearview AI in 2019-2020, she is “still not aware of any police in Canada that has obtained prior approval from a judge to use facial recognition technology in their investigations.”

Obtaining court approval, usually in the form of an order, is the “gold standard for privacy protection in criminal investigations,” according to Robertson. This ensures that facial recognition, if used, is appropriately balanced with the rights to freedom of expression, freedom of assembly and other Charter rights.

While the federal government has no jurisdiction over provincial and municipal police forces, it could amend the Criminal Code to include legal requirements for facial recognition software, in the same way it updated the law to account for voice recording technologies that could be used for surveillance.

In 2022, Canada’s federal, provincial and territorial privacy commissioners called on lawmakers to establish a legal framework for the appropriate use of facial recognition technology, including empowering independent oversight bodies, prohibiting mass surveillance and limiting the length of time images are stored in databases.

Meanwhile, the federal economic development department said Canadian law “may potentially” regulate companies’ collection of personal data under the Personal Information Protection and Electronic Documents Act, or PIPEDA.

“For example, if a police force, including the RCMP, were to engage a private commercial company to perform activities that utilize personal information, then those activities could potentially be subject to PIPEDA, including services related to facial recognition technologies,” the department said.

The Quebec provincial police also have a contract with Idemia, but would not say exactly how they use the company’s technology.

In an emailed statement, the force said its “automated facial matching system is not used to verify the identity of individuals.” The tool is used in criminal investigations and is limited to records of individuals fingerprinted under the Identification of Criminals Act.

AI governance expert Ana Brandusescu says that Ottawa and the country’s police forces have failed to heed calls for better governance, transparency and accountability in the procurement of facial recognition technology.

“Law enforcement agencies are not listening to scientists, civil society experts, people with lived experience and people who have been directly affected,” she said.

This report by The Canadian Press was first published June 30, 2024.

Joe Bongiorno, The Canadian Press