The Toronto-based Black Action Defense Committee (BADC) is calling on the Toronto Police Services Board to reject any recommendation to authorize the use of face recognition technology.
In a news release issued earlier this week, BADC said the use of the technology is immoral, illegal and unethical and it will “oppose its use by police in Toronto, Ontario and Canada with all the vigor and resources and experience that BADC has acquired.”
“This is the thin edge of the wedge in taking away the personal privacy that citizens have fought for in Canada and which is now guaranteed in the Charter of Rights and Freedoms,” the activist group stated.
“We will not accept Canada descending into a totalitarian or police state,” it declared.
BADC warned that the use of this technology by police will usher in “the tyranny that people in South America, Europe and elsewhere have been running from to live in Canada.”
Canadian privacy authorities have already launched an investigation into Clearview AI, a New York-based company, to determine whether the firm’s use of facial recognition technology complies with the country’s privacy laws.
Clearview AI describes itself as a tool for law enforcement, scraping the internet for publicly available photos and using facial recognition to identify potential suspects.
However, critics in both Canada and the United States have raised concerns about the lack of consent of those searched, and the potential for misuse of the service.
Several police forces in Ontario, including the Toronto Police, have publicly acknowledged using Clearview AI’s services.
The privacy commissioners of Canada and of the provinces of British Columbia, Alberta and Québec will jointly investigate whether Clearview AI’s practices comply with Canadian privacy legislation.
The investigation was initiated in response to media reports that “raised questions and concerns about whether the company is collecting and using personal information without consent,” said a joint statement from the commissioners’ offices.