No Statute, No Problem? Case Law May Expand Boundaries of Biometric Privacy Litigation
Biometric technology—which uses unique characteristics such as a person’s face, fingerprint, or voice to verify an individual’s identity—has become commonplace in modern life. People the world over are accustomed to unlocking their phones with their faces, logging into their laptops with fingerprints, or even punching a time clock with an iris scan.
As these technologies have proliferated, however, so has litigation over how companies collect, store, and use this data. Nine-figure settlements by tech giants have grabbed the most headlines. But companies of all shapes and sizes face legal risks if they collect or use biometric data in their businesses.
The main sources of liability have been new state statutes passed in recent years, tailor-made to address biometric privacy issues. The most well-known is the Illinois Biometric Information Privacy Act (BIPA), which created rules and standards for the collection and use of biometric data and—crucially—created a private right of action for alleged violations. Other states have begun to follow suit; at least a dozen BIPA-like laws (some with a private right of action) have been introduced in other state legislatures in the first half of 2023 alone.
But state statutes specifically aimed at biometric privacy may not be the only concern for companies that handle biometric data. A recent California state court decision, Renderos v. Clearview AI, lays out what may be a roadmap for plaintiffs to use generic consumer-protection statutes, state constitutional law, and even the common law to recognize private rights of action for alleged privacy violations—all without a biometric statute.
Could Existing Privacy/Unfair Competition Laws Fill the Gap?
Even without biometric privacy legislation, plaintiffs have argued that existing privacy and unfair competition laws are themselves sufficient to remedy biometric privacy violations. Some judges have agreed.
In November 2022, an Alameda County Superior Court judge in California held in Renderos that common law privacy rights, the California constitution, and California’s broad Unfair Competition Law all provided BIPA-like private causes of action. Renderos found this consistent with a federal judge’s decision months earlier in In re Clearview AI, a multidistrict litigation against defendant Clearview AI. As alleged, Clearview “scraped” face data from public sources (like Facebook and Twitter) and used it to create facial recognition databases; it then sold services to customers, including police departments and national security agencies. The claims in Renderos and the Clearview MDL all concerned the alleged collection and commercialization of face-recognition data by Clearview.
These alleged actions led to claims falling into three groups: misappropriation of likeness/“publicity” claims; invasion of privacy; and unfair competition/consumer protection. The facts and causes of action alleged in these cases deserve close examination, because they may not apply in all biometric data contexts.
First, the “misappropriation of likeness” and “right to publicity” claims expressly involved use of the plaintiffs’ likenesses, as well as their rights to control the commercial use of their images. It is thus not clear whether the same logic would extend to other forms of biometric data, such as fingerprints or voice recognition. It remains unclear whether, for example, a person’s fingerprint is part of their “image” or “likeness.”
The invasion of privacy claims may have broader applicability, however. In Renderos, the court (quoting the Clearview MDL) agreed “biometric information, by its very nature, is sensitive and confidential.” Thus, Renderos found the use of that information—without proper consent—could violate the California state constitution’s right to privacy. This aspect of Renderos is not so easily limited to face recognition. Plaintiffs may contend that fingerprints and voices should be considered as “sensitive and confidential” as faces.
Finally, the California Unfair Competition Law claim—which gives individuals a derivative right of action for any unlawful business practice that damages them—was found to rise or fall with the merits of the other two claims, and thus, was upheld. Notably, Renderos found the plaintiffs’ damages were that they were “not compensated” for use of their biometric data—a possible distinction for companies that only use biometric data internally (e.g., for identity verification), rather than for commercialization.
State Attorneys General and the FTC May Also Step Up Enforcement Actions
Of course, biometric compliance risk is not limited to class actions brought under private rights of action. Texas’ recent suit against a notable social media platform, also involving facial-recognition technology, is a stark reminder. Texas has a biometric privacy law like BIPA. And although it lacks a private right of action, it carries steeper per-violation fines ($25,000) that the Texas attorney general has sought to enforce.
Another key wrinkle to the Texas AG’s action is that it also asserts claims under Texas’ deceptive trade practices statute. That statute does confer a private right of action. If that aspect of the Texas AG’s suit is successful, it may open other companies to biometric privacy suits framed as deceptive trade practice claims—both in Texas and in other states with similar consumer-protection laws on the books.
Most recently, on May 18, the FTC released a policy statement on biometric data. That statement specifically warns the agency will bring enforcement actions relating to biometric data if companies that deploy biometric technologies make false or unsubstantiated claims regarding such technologies or fail to assess and promptly address known or foreseeable risks relating thereto.
Biometric privacy laws are a vital concern for any company that collects, handles, or uses biometric data. As more and more courts and state agencies seek to push the boundaries of biometric privacy ahead of where state legislatures have gone, it is more important than ever for companies to stay abreast of these issues and the evolving compliance risks.
"No Statute, No Problem? Case Law May Expand Boundaries of Biometric Privacy Litigation," by Jeffrey N. Rosenthal and Andrew Schrag was published in The Legal Intelligencer on May 24, 2023.
Reprinted with permission from the May 24, 2023, edition of The Legal Intelligencer © 2023 ALM Properties, Inc. All rights reserved. Further duplication without permission is prohibited.