By: Alex Coplan
Clearview AI is a small company that has constructed perhaps the largest international facial recognition database ever created. Clearview’s AI program calculates unique indicators about our bodies—like the distance between our eyes and the shapes of our cheekbones—to create a “faceprint.” Clearview scrapes this data from social media posts, online videos, and more, amassing billions of faceprints without our consent. Much like DNA or fingerprint information, law enforcement and other entities can use Clearview’s faceprints to identify us.
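Conceptually, a faceprint reduces facial geometry to a vector of numbers that can be compared across photos. The toy Python sketch below illustrates the idea with hypothetical landmark coordinates and a simplistic pairwise-distance feature vector; it is not Clearview’s actual algorithm, which is proprietary and far more sophisticated.

```python
import math

def faceprint(landmarks):
    """Build a toy faceprint: the pairwise distances between facial landmarks.

    `landmarks` maps a landmark name (e.g. "left_eye") to an (x, y) point.
    Real systems extract dozens of landmarks and learn richer features.
    """
    points = list(landmarks.values())
    return [math.dist(p, q)
            for i, p in enumerate(points)
            for q in points[i + 1:]]

def similarity(a, b):
    """Euclidean distance between two faceprints: smaller means more alike."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Two photos of (roughly) the same face: landmark positions differ slightly.
face_a = {"left_eye": (30, 40), "right_eye": (70, 40), "nose": (50, 60),
          "left_cheek": (25, 70), "right_cheek": (75, 70)}
face_b = {"left_eye": (31, 41), "right_eye": (69, 40), "nose": (50, 61),
          "left_cheek": (26, 70), "right_cheek": (74, 71)}

print(similarity(faceprint(face_a), faceprint(face_b)))
```

Because the distances depend only on facial geometry, the same person photographed twice yields nearly identical faceprints, which is what makes the data both useful for identification and, as discussed below, impossible to “change” once breached.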
Companies should not be able to collect and store our biometric data without our consent. Last year, the American Civil Liberties Union (ACLU) sued Clearview for violating Illinois’ Biometric Information Privacy Act (BIPA). BIPA prohibits capturing people’s biometric information—such as DNA, fingerprints, or faceprints—without their consent. Following the examples of Illinois’ BIPA and California’s Consumer Privacy Act (CCPA), states can enact their own biometric privacy laws to protect their citizens while they wait for the federal government to create uniform standards.
Clearview is unlike other facial recognition technologies used by authorities, which pull their photos from driver’s licenses and mugshots. Instead, Clearview allows its users to take a photo of someone, upload it, and then see public photos of that person, along with links to where those photos appear online. Fortunately, Clearview has stopped selling its product to private companies and now provides it only to law enforcement agencies. At least 2,400 United States law enforcement agencies use Clearview.
Make no mistake, there are some benefits to Clearview’s technology. Clearview has helped law enforcement solve shoplifting, identity theft, credit card fraud, murder, and child exploitation cases. It was even used to positively identify some of the rioters who attacked the U.S. Capitol this January.
Despite the benefits of a program like Clearview, the dangers are real. The capture and storage of faceprints leaves people vulnerable to data breaches and identity theft. The ACLU’s complaint alleges that Clearview’s technology “can invasively identify everyone at a protest, a political rally, a house of worship, a domestic violence shelter, an Alcoholics Anonymous meeting, and more.” Further, United States Senator Ed Markey summarized the complications surrounding a future Clearview data breach by asserting that “if your password gets breached, you can change your password. If your credit card number gets breached, you can cancel your card. But you can’t change biometric information like your facial characteristics if a company like Clearview fails to keep that data secure.”
In response to widespread criticism, Clearview argues it has a First Amendment right to disseminate information that is publicly available online. Clearview claims it gathers only publicly available photos and then uses them in a search engine that expresses Clearview’s opinion on who is in those photos. Current constitutional protections may support Clearview’s position: it may freely discuss and circulate images it finds online without violating the First Amendment, and Fourth Amendment privacy protections are not currently violated by the use of facial recognition programs on publicly available information.
Instead, Clearview violates state biometric privacy laws that prohibit private companies from capturing our faceprints without notice and consent. Currently, there is no comprehensive federal privacy law governing the collection, use, and sale of biometric data by private companies, so states are left to determine how to handle the situation. Illinois, through BIPA, may find the use of Clearview illegal absent an individual’s consent. Another BIPA class action, against Facebook, settled this year: the social media giant will pay out approximately $550 million for its unlawful use of facial recognition technology. Facebook sparked that lawsuit when the initial version of its Tag Suggestions tool captured users’ faces, using and storing that biometric data without individual consent. Working from photos published by users, the tool scanned faces and suggested who the person in each photo might be.
Adopted in 2008, BIPA remains one of the strictest biometric data laws in the country. BIPA is an informed consent statute that protects individuals’ privacy by making it unlawful for a company to, among other things, “collect, capture, purchase, receive through trade, or otherwise obtain a person’s or a customer’s biometric identifiers” unless it first provides notice that biometric information is being stored and informs the subject of the purpose and length of term for which the information is being collected. Further, the statute prohibits private companies from selling, leasing, trading, or otherwise profiting from Illinois citizens’ biometric information.
By failing to inform people how, when, and to what extent their biometric data will be used, Clearview has failed to obtain the necessary consent of the individuals who appear in the photos it collects. In light of the recent settlement with Facebook, the class action suit against Clearview has a good chance of success.
To mitigate the issue, Clearview has cancelled all accounts belonging to entities in Illinois. However, it argues it should not face an injunction prohibiting it from using current or past biometric data from Illinois residents. Under Illinois’ BIPA, Clearview will likely have a tough time avoiding that injunction.
The fight to protect against the dangers of face surveillance technology is ongoing. Across the nation, the ACLU and other advocacy groups have been successful in implementing bans on police use of facial recognition technology. Until Congress regulates this increasingly popular software, states and private individuals will be left to defend themselves against intrusions on their privacy. In the meantime, states should use Illinois’ BIPA as a model for their own biometric privacy regulations.