Jessy Edwards  |  June 11, 2021

Category: Legal News


Canadian Police Broke the Law Through Use of Surveillance Software, Privacy Watchdog Says
(Photo Credit: Anne Richard/Shutterstock)

The Royal Canadian Mounted Police (RCMP) broke Canadian law when it used facial-recognition software to search through huge troves of images of innocent people, the federal privacy commissioner says.

On Thursday, Privacy Commissioner Daniel Therrien released a report saying the RCMP had violated the Privacy Act through its use of the facial-recognition software Clearview AI, Global News reported.

“The use of [facial recognition technology] by the RCMP to search through massive repositories of Canadians who are innocent of any suspicion of crime presents a serious violation of privacy,” he said. 

He added that there were “serious and systemic” failings by the police to ensure compliance with privacy laws before using the software.

“A government institution cannot collect personal information from a third party agent if that third party agent collected the information unlawfully,” he wrote. 

New York company Clearview AI scrapes billions of images of people from across the internet and provides them to police forces, financial institutions and others seeking to identify people by face.

In a related probe in February, the privacy commissioner and three provincial counterparts found that Clearview AI was violating Canadians’ privacy rights, essentially conducting mass surveillance of the public.

They said the company violated both federal and provincial laws governing personally identifiable information.

As a result of the investigation, Therrien said Clearview AI would no longer offer its services in Canada.

Meanwhile, last year, the Federal Court of Canada approved a $100 million class action settlement against the RCMP over gender-based abuse and discrimination.

The plaintiffs represent a class of women who held non-policing jobs in the RCMP between 1974 and 2019. In addition, family members and spouses affected in severe cases may also be entitled to a portion of the settlement.

What do you think of the RCMP’s use of the software? Let us know in the comments!



