Virtual Try-On Technology: Practical Guidance to Mitigate Biometric Privacy Liability Risk
This is the second article in a two-part series analyzing how retail brands can mitigate their liability exposure in connection with today’s ever-increasing mix of biometric privacy laws while using virtual try-on technology. Part one discussed the expanding biometric privacy liability risks associated with the use of today’s increasingly popular virtual try-on features. Part two provides tips and strategies for brands that currently use, or are contemplating the use of, virtual try-on features to maximize effectiveness while minimizing their potential biometric privacy liability exposure.
Retail brands that utilize facial recognition-powered virtual try-on technology have recently become a primary target for bet-the-company biometric privacy class action lawsuits brought under the Illinois Biometric Information Privacy Act (BIPA). The Federal Trade Commission (FTC) has also quickly emerged as a sizeable liability threat for brands that utilize this technology, as the FTC has made policing facial recognition a top priority of the agency for the foreseeable future.
At the same time, additional liability exposure exists on the horizon for brands that offer virtual try-on features to shoppers, as a flood of new legislation and regulation governing the use of facial recognition is likely to be enacted across the country in the near future.
Taken together, these developments mean that brands should take proactive steps to build out their biometric privacy compliance programs in order to mitigate their current and anticipated liability exposure to the greatest extent possible. Fortunately, there are several key best practices that brands can implement to mitigate these risks in connection with the use of virtual try-on technology.
Early Involvement of Biometric Privacy Counsel
As a preliminary matter, a vital initial consideration for brands is to consult with experienced biometric privacy counsel well before implementing or deploying any type of virtual try-on feature that utilizes facial recognition technology. Doing so is critical to ensuring compliance with today’s constantly evolving biometric privacy legal landscape.
Arbitration Agreements & Class Action Waivers
Online terms containing arbitration agreements and class action waivers have developed into one of the strongest liability risk mitigation measures available to brands to shield themselves from biometric privacy class action litigation. Utilized properly, binding consumer arbitration agreements and class action waivers can serve as a powerful tool to force class action lawsuits out of court and into individual arbitration, and can be deployed at the outset of litigation to procure an early exit from costly BIPA and other biometric privacy class action lawsuits.
The enforceability of online consumer arbitration agreements turns on two primary issues: (1) the language of the arbitration agreement; and (2) how the online terms containing the arbitration agreement are presented to shoppers on the brand’s webpage or mobile app. Brands must ensure that they satisfy the applicable requirements relating to both issues to ensure the enforceability of their online arbitration agreements if they ever find themselves in court facing a biometric privacy class action lawsuit.
First, careful attention must be given to getting the language of the arbitration agreement right. Importantly, brands should not leave their ability to compel arbitration to chance by simply copying and pasting boilerplate arbitration and class action waiver language into their online agreements. Rather, brands must meticulously tailor their arbitration and class action waiver provisions to the specific, individualized intricacies and circumstances unique to their own operations to maximize their power to force arbitration if the need ever arises.
At the same time, careful consideration must be given as to how the brand’s online terms containing arbitration and class action waiver provisions are presented to website or mobile app users, as inadequate presentation of online terms may derail a brand’s ability to utilize even the most expertly drafted arbitration agreement. Here, online terms must provide users with “clear and conspicuous” notice that their use of the brand’s virtual try-on feature constitutes assent to the brand’s arbitration agreement.
Notice & Consent
From a broader perspective, brands should also make a concerted effort to be as transparent as possible regarding the collection and use of facial template data in connection with their virtual try-on technology. Brands can significantly limit their liability exposure by ensuring that relevant information regarding their facial recognition practices is offered to users of their virtual try-on features at each stage of the biometric data lifecycle.
In addition to providing notice, brands must also obtain written consent from all virtual try-on users relating to the brand’s use of facial recognition technology before any facial template data is captured through the brand’s virtual try-on feature. This consent should cover both the brand’s collection and use of the user’s facial template data and the brand’s sharing or disclosure of that data to third parties for purposes relating to the operation of its virtual try-on feature.
Like the mechanism for providing notice, brands can obtain this required written consent using a pop-up banner designed to appear when the user first lands on the brand’s virtual try-on feature page. In addition to including the language needed to obtain consent, this pop-up banner must also require the user to take an affirmative action to expressly signify consent to the collection and use of his or her facial template data, such as clicking an “I Consent” box placed inside the banner. Importantly, brands should bar users from accessing the virtual try-on feature itself until affirmative consent has been obtained from the user.
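For engineering teams implementing this requirement, the gating logic can be modeled as a simple deny-by-default check that is flipped only by an explicit user action. The sketch below is purely illustrative, not legal-grade consent tooling: the names (`ConsentGate`, `recordConsent`, `canAccessTryOn`) are invented for this example, and a production system would also persist a timestamped, auditable consent record.

```typescript
// Illustrative sketch of an affirmative-consent gate for a virtual
// try-on feature. All names here are hypothetical, not from any
// real library or the article itself.

type ConsentRecord = {
  userId: string;
  consentedAt: Date; // when the user clicked the "I Consent" box
};

class ConsentGate {
  private records = new Map<string, ConsentRecord>();

  // Called only in response to an affirmative user action,
  // e.g. clicking "I Consent" inside the pop-up banner.
  recordConsent(userId: string): void {
    this.records.set(userId, { userId, consentedAt: new Date() });
  }

  // The try-on feature (and any camera / face-capture code path)
  // should be unreachable until this returns true.
  canAccessTryOn(userId: string): boolean {
    return this.records.has(userId);
  }
}

// Usage: access is denied by default and allowed only after consent.
const gate = new ConsentGate();
console.log(gate.canAccessTryOn("user-1")); // false: no consent yet
gate.recordConsent("user-1");
console.log(gate.canAccessTryOn("user-1")); // true: consent recorded
```

The key design choice is that absence of a record means no access: the feature never assumes consent, which mirrors the article’s guidance that users be barred from the feature until affirmative consent is captured.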
Data Security
Brands must implement data security measures to protect all facial template data that is captured, possessed, and stored through their virtual try-on technology from improper disclosure, access, or acquisition. Specifically, these security safeguards must: (1) satisfy the reasonable standard of care applicable to the brand’s given industry; and (2) protect facial template data in a manner that is the same as, or more protective than, the manner in which the brand protects other forms of sensitive personal information.
Explicit Prohibition on Selling or Otherwise Profiting from Facial Template Data
Lastly, brands must implement and enforce a strict ban that prohibits the brand, its employees, and any vendors from selling or otherwise profiting from users’ facial template data. In particular, this prohibition should address the brand’s use of collected facial template data for purposes of improving or enhancing the entity’s virtual try-on software or any of its other facial recognition products or technologies.
Importantly, while there might be significant interest—especially from the brand’s in-house marketing team—in using this facial template data to enhance the brand’s virtual try-on software, activities of this nature should be avoided at all costs, as doing so could open up the brand to significant liability exposure in connection with both BIPA class action litigation and FTC enforcement actions. In fact, the ill-advised use of facial template data to enhance internal facial recognition algorithms was one of the primary reasons that photo app developer Everalbum, Inc. became the first entity to find itself on the receiving end of an FTC enforcement action stemming from improper facial recognition practices.
Conclusion
Virtual try-on is not a new concept. With that said, recent advancements in facial recognition and augmented reality have transformed this technology into a powerful tool for retail brands to directly connect shoppers with their products through their mobile devices. As a result, a wide swath of brands have turned to virtual try-on features to boost customer engagement and drive revenue growth. As time progresses and shopping migrates further online, the number of retailers who utilize virtual try-on technology will continue to grow at a rapid rate.
At the same time, brands currently face sizeable liability exposure stemming from the use of virtual try-on features that operate with the help of facial biometrics, the result of a sharp increase in BIPA class action lawsuits taking direct aim at brands that utilize this technology. That liability exposure will only continue to grow as cities, states, and Congress seek to impose rigorous mandates and limitations on the use of facial recognition and the FTC becomes more active in clamping down on the misuse of consumers’ facial template data.
As such, brands that are currently using—or contemplating the use of—virtual try-on technology are strongly encouraged to take a proactive stance and build out biometric privacy compliance programs that encompass the practices and principles discussed above.
“Virtual Try-On Technology: Practical Guidance to Mitigate Biometric Privacy Liability Risk,” by David J. Oberly was published in Legaltech news on July 7, 2021.
Reprinted with permission from the July 7, 2021, edition of Legaltech news. © 2021 ALM Media Properties, LLC. All rights reserved. Further duplication without permission is prohibited.