Clearview AI: The Secretive Company That May End Privacy As We Know It
According to the New York Times, hundreds of U.S. law enforcement agencies, including the FBI, are using a new facial recognition application from Clearview AI that compares uploaded images against a database of more than three billion photos scraped from social media and other websites.
With the Clearview AI app, a user can take a photo of a person and upload it. The app matches the face against publicly available pictures of that person and provides links to the sites where those photos originally appeared.
The publicly available photos in Clearview AI's database were collected from Facebook, Instagram, and Twitter, as well as Venmo, YouTube, employment sites, educational sites, and millions of other websites.
Traditionally, law enforcement facial recognition software has searched government-held images first, such as mugshots and driver's license photos. According to the New York Times, however, Clearview AI goes far beyond anything built by the U.S. government or the Silicon Valley giants.
The size of Clearview AI's database makes law enforcement databases look small by comparison: the FBI's database of passport and driver's license photos, one of the largest, holds just over 641 million images of U.S. citizens.
The report states that the application's code includes support for augmented reality glasses, meaning a user could potentially identify every person in view. Clearview AI could thus be used to identify protesters and activists in a crowd, revealing not only their names and addresses but also the people they associate with.
According to the company's website, the technology is a new research tool that law enforcement agencies use to identify perpetrators and victims of crime; it has reportedly helped arrest hundreds of criminals, exonerate innocent people, and identify crime victims.
Only law enforcement agencies appear to be eligible to request access to the app. The Clearview AI startup told the New York Times that more than 600 law enforcement agencies began using the app last year, but it declined to name them, citing security concerns.
The company says the app finds matches in 75 percent of cases. However, because the app has not been tested by any independent body, it is unknown how many false matches it produces, and law enforcement agencies are reportedly uploading sensitive images to servers whose security has never been audited.
Privacy experts warn against using facial recognition databases such as the one Clearview AI provides, and many U.S. cities and states have already barred police and government agencies from using facial recognition technology.