Publications
Article

How to Avoid Becoming the Next Major BIPA Class Action Target When Using Facial Recognition for Security and Surveillance

Biometric Update

Rapid advances have fueled a proliferation of facial recognition technology, which continues to expand into new areas of public and private life. In particular, retailers and similar commercial entities increasingly rely on facial recognition for security/surveillance purposes.

At the same time, however, facial recognition has become an increasingly popular target for class action litigation pursued under the Illinois Biometric Information Privacy Act (“BIPA”). To further complicate matters, numerous states and Congress are attempting to enact additional, stringent laws regulating the use of facial recognition technology by commercial businesses.

Taken together, these developments mean that all entities using facial recognition technology today, especially those that utilize it for security and surveillance purposes, must ensure they have appropriate biometric privacy compliance practices in place to avoid becoming the next target of a potentially game-changing biometric privacy class action lawsuit.

Facial recognition technology explained

Facial recognition technology involves the use of “biometrics” (i.e., individual physical characteristics) to digitally map an individual’s facial “geometry.” These measurements are then used to create a mathematical representation known as a “facial template” or “facial signature.” The stored template or signature is then compared against the physical structure of an individual’s face to confirm or uniquely establish that individual’s identity.
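For readers who want a more concrete picture, the simplified sketch below illustrates in Python how a stored facial template might be compared against a newly computed one. The extract_template function is a hypothetical placeholder for a trained face-embedding model, and the similarity threshold is purely illustrative; real systems rely on proprietary models and carefully tuned, bias-tested thresholds.

    # Illustrative sketch only: template-based face matching using cosine similarity.
    # extract_template is a hypothetical stand-in for a real face-embedding model.
    import numpy as np

    def extract_template(face_image: np.ndarray) -> np.ndarray:
        """Map a face image to a fixed-length numeric 'facial template' (embedding).
        Placeholder: a production system would use a trained model or vendor SDK."""
        raise NotImplementedError("Replace with a real face-embedding model.")

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Compare two templates; values near 1.0 indicate a likely match."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def is_same_person(stored_template: np.ndarray, new_template: np.ndarray,
                       threshold: float = 0.6) -> bool:
        """Verification step: decide whether two templates belong to the same person.
        The 0.6 threshold is an illustrative assumption, not a recommended value."""
        return cosine_similarity(stored_template, new_template) >= threshold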

Legal landscape

At this time, only three states have targeted biometric privacy laws on the books that directly regulate the use of facial recognition technology.

Of those laws, Illinois’ BIPA is considered the most stringent. Under BIPA, a private entity cannot collect or store facial template data without first providing notice, obtaining written consent, and making certain disclosures. BIPA has created substantial liability exposure for companies that utilize facial recognition in their operations. This risk arises primarily from the statutory damages available under BIPA, ranging from $1,000 to $5,000 per violation, which can be pursued even where no actual harm or damage is sustained.

As just one example of the tremendous liability posed by BIPA, Facebook recently settled a major BIPA class action lawsuit involving the use of facial recognition technology for a staggering $650 million, but only after a judge rejected Facebook’s original $550 million offer.

Beyond Illinois, Texas and Washington have also enacted biometric privacy laws covering facial recognition technology, which impose similar requirements relating to notice, consent, and mandatory security measures.

In addition, many states are currently attempting to enact biometric privacy legislation of their own, which would extend BIPA-like notice, consent, and security requirements to additional parts of the country.

Importantly, the scope of many of these bills goes well beyond that of BIPA and would impose additional requirements, such as mandated pre-deployment testing and periodic employee training, as well as provisions permitting third parties to test the technology.

At the same time, federal lawmakers have also targeted facial recognition as a primary focus for regulation, which would put in place uniform requirements over the use of this technology nationwide.

Most recently, in August 2020 Senators Jeff Merkley (D-OR) and Bernie Sanders (I-VT) introduced the National Biometric Information Privacy Act of 2020 (S.4400), which would impose BIPA-like requirements on the use of facial recognition technology (and other forms of biometrics) from coast to coast.

Additional challenges and risks

With that said, the current (and future) challenges associated with facial recognition technology are not limited to significant legal liability exposure.

Facial recognition has recently received a significant amount of negative media coverage over potential accuracy and bias problems associated with this technology. Of particular concern is the fact that today’s technology is markedly less accurate at identifying women and people of color, creating a heightened risk that members of minority groups will be misidentified.

Facial recognition has also garnered a significant amount of negative publicity stemming from controversial uses of this technology. At the start of 2020, news broke regarding the practices of facial recognition startup Clearview AI, which built a massive database of facial templates of millions of individuals across the world and then sold access to its database to both law enforcement and private entities.

More recently, Rite Aid made national headlines as a result of its practice of deploying facial recognition technology for security/surveillance purposes in over 200 stores nationwide, the majority of which were located in lower-income neighborhoods.

Taken together, these developments make clear that companies utilizing facial recognition software for security/surveillance purposes will remain under significant scrutiny for the foreseeable future.

A new major target for biometric privacy class action litigation

To date, the primary focus of BIPA class action litigation has been employers that utilize biometric fingerprint readers for time and attendance purposes. More recently, however, a new BIPA target has appeared on the radar of plaintiffs’ attorneys: companies using facial recognition for security and surveillance purposes.

For example, Lowe’s and Home Depot have been targeted with BIPA class action lawsuits stemming from their use of facial recognition technology as part of their in-store, loss prevention surveillance systems.

Not surprisingly, Clearview AI has been named in a string of BIPA complaints stemming from the development and sale of its surveillance database. National retailer Macy’s was also named in a BIPA class action lawsuit stemming from its use of Clearview AI’s database for surveillance and security purposes.

Most recently, Kroger found itself on the receiving end of a BIPA class action lawsuit over its collection of patrons’ and employees’ facial template data through the use of surveillance cameras located in its Illinois Mariano’s locations.

Compliance tips

Due to the rapidly increasing liability exposure associated with the use of facial recognition technology, companies that use this technology, or are considering doing so in the future, should not wait until new laws are passed, even if they are not subject to any regulation at this time. Instead, they should take affirmative action now to put in place flexible, adaptable compliance programs that can ensure ongoing compliance with facial recognition regulation.

Fortunately, there are several actionable steps companies can take to effectively leverage facial recognition technology in a manner that satisfies their legal obligations. In particular, companies should consider the following:

  • Accuracy and bias testing: Because facial recognition software can produce results that are biased in ways that harm particular ethnic and racial groups, pre-deployment testing of facial recognition technology should be completed to ensure its effectiveness and accuracy before it is used in real-time situations.
  • Privacy policy: Develop a publicly available, detailed privacy policy specific to facial recognition that includes, at a minimum, clear notice that facial template data is being collected, as well as additional information regarding the purposes for which facial template data is used and the company’s schedule and guidelines for the retention and destruction of this data.
  • Written notice: Provide written notice—prior to the time any facial template data is collected—which clearly informs individuals that facial template data is being collected, used, and/or stored by the company; how that data will be used and/or shared; and the length of time over which the company will retain the data until it is destroyed.
  • Written release: Obtain a signed written release/consent form from all individuals, prior to the time any facial template data is collected, permitting the company to collect and use the individual’s biometric data and to disclose that data to third parties for business purposes (a simplified sketch of how such consent records might be tracked appears after this list).
  • Opt-out: Permit individuals to opt out of the collection of their facial template data.
  • Data security: Maintain data security measures to safeguard facial template data that satisfy the reasonable standard of care applicable to the company’s industry and that protect facial template data in a manner that is the same as, or more protective than, the manner in which the company protects other forms of sensitive personal information.
  • Explicit prohibitions on using technology for discriminatory purposes: Maintain an explicit policy strictly barring the use of facial recognition technology by employees, contractors, or vendors to unlawfully discriminate against individuals or groups of individuals.
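To make the notice, consent, retention, and opt-out items above more concrete, the following simplified Python sketch shows one hypothetical way a company might track those facts internally. The record structure, field names, and checks are illustrative assumptions only; they are not requirements drawn from BIPA or any other statute, and they are not a substitute for legal review.

    # Illustrative sketch only: a hypothetical internal record of biometric
    # notice, consent, retention, and opt-out status. Not legal advice.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class BiometricConsentRecord:
        subject_id: str                                # internal identifier, not the template itself
        notice_provided_at: datetime                   # when written notice was given
        written_release_signed_at: Optional[datetime]  # None if no signed release on file
        purpose: str                                   # e.g., "in-store security/surveillance"
        retention_deadline: datetime                   # date by which the template must be destroyed
        opted_out: bool = False                        # honor opt-out requests

        def collection_permitted(self) -> bool:
            """Collect or store a facial template only after notice and a signed
            release, and only for individuals who have not opted out."""
            return self.written_release_signed_at is not None and not self.opted_out

        def destruction_due(self, now: Optional[datetime] = None) -> bool:
            """Flag records whose retention deadline has passed so the underlying
            template data can be destroyed on schedule."""
            return (now or datetime.utcnow()) >= self.retention_deadline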

Conclusion

Facial recognition technology has significantly enhanced the operations of businesses across all industries in a myriad of different ways—including with respect to security/identity fraud prevention; access and authentication; and accessibility to accounts and services.

At the same time, this technology has become an increasingly frequent target of biometric privacy class action lawsuits, exposing businesses to tremendous potential legal liability. Moving forward, the scope of liability exposure will only increase as additional states and Washington, D.C. look to impose greater regulation over the use of facial biometrics.

Consequently, companies that incorporate facial recognition technology into their operations for security/surveillance purposes or intend to do so in the future—even those located in jurisdictions where no applicable regulation currently exists—should take proactive measures to develop and implement facial recognition biometrics compliance programs that encompass the principles and practices described above.

“How to Avoid Becoming the Next Major BIPA Class Action Target When Using Facial Recognition for Security and Surveillance,” by David J. Oberly was published in Biometric Update on September 16, 2020.