Facial Deletion: The Everalbum “Course Correction” Should Scare Privacy Violators

By: Tallman Trask

For years, privacy rights violators have paid fines and been the subject of consent decrees. They’ve entered into settlement agreements with enforcers, agreed to change behaviors, and sometimes even been required to notify users or consumers of the privacy violations. The Federal Trade Commission’s (FTC) recent Everalbum settlement, however, offers regulators a new tool, and it is one companies ought to be very wary of: unlike previous settlements where companies simply paid a penalty and were able to keep hold of the products and algorithms developed through their privacy violations, the settlement here requires Everalbum not only to delete the data it collected as part of a violation, but also to delete any “affected work product” created using that data.

Everalbum operated Ever, a free photo storage app, until August of 2020, when the company shut down the app. At its core, Ever served a simple purpose: users could upload, store, and organize photos in the company’s cloud storage. Over time, however, the app developed a hidden secondary purpose: Everalbum used it to develop and train a commercial facial recognition product the company designed “specifically for mission-critical applications” and offered to military and law enforcement buyers. The company’s software apparently worked well and achieved a high degree of accuracy in testing. According to the FTC’s complaint, Everalbum utilized user photos without first obtaining express consent. Problematically, Everalbum’s use of facial recognition may have even conflicted with the company’s own policies. The American Civil Liberties Union (ACLU) of Northern California (where Everalbum is based) called the use of user photos “an egregious violation of people’s privacy,” and users described Everalbum’s use of their photos to create facial recognition products as “a huge invasion of privacy.”

The Everalbum settlement requires the company to provide notice and obtain affirmative consent before availing itself of users’ biometric information in the future. Similarly, it prohibits future misrepresentations about the company’s data collection practices and imposes consent monitoring and recordkeeping requirements. While the settlement does not impose a fine against Everalbum, it requires the company to delete “affected work product” created through the alleged privacy violations highlighted in the complaint. Here, that means Everalbum must destroy years of work on facial recognition software where that work was based on photos uploaded by users who were unaware their photos were being used to develop facial recognition products, including the results of the company’s efforts to train its facial recognition software.

The deletion requirements in the Everalbum settlement are novel; prior settlements have imposed fines without requiring deletion. For example, the FTC’s 2019 settlement with Facebook imposed a record $5 billion fine, but did not require Facebook to delete or remove any portion of its software built using information gathered through the improper treatment of user data. Similarly, the FTC’s 2019 settlement with Google allowed the company to keep products it had built using data improperly collected in violation of children’s privacy rules. Praising a portion of the Everalbum settlement, former FTC Commissioner Rohit Chopra described the deletion requirement as a “course correction” from the FTC’s prior privacy settlements, which allowed violators to keep products developed through privacy violations.

Beyond a simple “course correction,” however, the Everalbum settlement is a new tool in the regulatory toolbox. While fines can serve as a powerful deterrent, they ought not be the only tool available to regulators. Where fines are the only available means, the system becomes little more than a pay-to-violate model. At least one commentator has suggested that the Everalbum settlement provides a roadmap for encouraging Big Tech to care more about privacy, and that very well may be true. However, a larger impact may be felt by smaller technology companies and others who collect data, particularly where companies are subject to a wide range of state-level and international privacy regimes. An earlier stage company with a smaller user base may, for example, be likely to face only small fines if caught violating user privacy under the framework of the Facebook and Google settlements, but the Everalbum framework could result in deletion requirements that impact a smaller company’s entire range of product offerings. That is, while the earlier settlements may have wiped out bank accounts, the algorithm deletion framework of the Everalbum settlement could wipe out companies. And that is a tool companies and privacy officers should be very afraid of seeing pointed their way.