Clearview AI Finally Participates in a Federal Accuracy Test


Clearview AI scraped more than 10 billion photos from the public internet to create a facial recognition tool that it markets to law enforcement for identifying unknown people. Critics said the company’s product was illegal, unethical and untested. Now, more than two years after law enforcement first started using the company’s app, Clearview’s algorithm, which matches a face to other photos of the same person, has been subjected to a third-party test for the first time. It performed surprisingly well.

In a field of more than 300 algorithms from more than 200 facial recognition vendors, Clearview ranked among the top 10 for accuracy, alongside NTechLab of Russia, SenseTime of China and other more established outfits. But the test Clearview took measures how accurately its algorithm matches two different photos of the same person, not how accurately it finds a match for an unknown face in a database of 10 billion photos.

The National Institute of Standards and Technology, or NIST, a unique federal agency that is also a scientific laboratory, runs its Face Recognition Vendor Test every few months. There are two versions of the test: one for verification, the one-to-one (1:1) matching used, for example, to unlock a smartphone, and one for the one-to-many (1:N) search used by law enforcement to identify someone against a large database. Strangely, Clearview submitted its algorithm for the former test rather than the latter, which is the kind of search its product was built to perform.
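To illustrate the distinction between the two test modes, here is a minimal, hypothetical Python sketch that compares made-up face embeddings with cosine similarity. The vectors, names and threshold are illustrative assumptions only, not NIST’s protocol or Clearview’s actual method:

```python
# Illustrative sketch of 1:1 verification vs. 1:N identification.
# Embeddings, names, and the threshold are hypothetical stand-ins.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_1_to_1(probe: np.ndarray, claimed: np.ndarray, threshold: float = 0.8) -> bool:
    """1:1 verification: are these two photos the same person? (phone-unlock scenario)"""
    return cosine_similarity(probe, claimed) >= threshold

def identify_1_to_n(probe: np.ndarray, gallery: dict, threshold: float = 0.8):
    """1:N identification: who in a large gallery, if anyone, matches this face?
    (the law-enforcement search scenario)"""
    scores = {name: cosine_similarity(probe, emb) for name, emb in gallery.items()}
    best = max(scores, key=scores.get)
    return (best, scores[best]) if scores[best] >= threshold else (None, scores[best])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    alice = rng.normal(size=128)
    probe = alice + rng.normal(scale=0.05, size=128)  # a second photo of "alice"
    gallery = {"alice": alice, "bob": rng.normal(size=128), "carol": rng.normal(size=128)}
    print("1:1 verification:", verify_1_to_1(probe, alice))
    print("1:N search:", identify_1_to_n(probe, gallery))
```

The verification test scores only the pairwise comparison; the identification test additionally measures how reliably the correct entry rises to the top of a very large gallery, which is the harder problem Clearview’s product is sold to solve.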

Hoan Ton-That, the chief executive of Clearview AI, described the results as a “perfect validation” of the company’s product. He also said the company would “send shortly” its algorithm for the one-to-many test.

NIST has been testing the accuracy of facial recognition providers since 2000, but participation is voluntary and testing is not required for government agencies to purchase the technology. Even though its accuracy had never been verified by NIST, Clearview AI claims thousands of local and state police departments as clients, and a recent report from the Government Accountability Office noted that the tool is also used by a number of federal agencies, including the FBI, the Secret Service and the Department of the Interior.

Clearview AI has been sued in state and federal court in Illinois and Vermont for collecting people’s photos without their permission and subjecting them to facial recognition searches. The company has also come under attack from other vendors, as reported by Insider, who are concerned that the controversy surrounding Clearview AI will cause problems for the facial recognition industry as a whole.


