Virtual Try-On Technology: Inside the Current & Anticipated Biometric Privacy Legal Landscape
This is the first article in a two-part series analyzing how retail brands can mitigate their liability exposure in connection with today’s ever-increasing mix of biometric privacy laws while using virtual try-on technology. Part one discusses the expanding biometric privacy liability risks associated with the use of today’s increasingly popular virtual try-on features. Part two provides tips and strategies for brands that currently use, or are contemplating the use of, virtual try-on features to maximize effectiveness while minimizing their potential biometric privacy liability exposure.
Facial recognition has rapidly transformed—and enhanced—the consumer experience across numerous industries. It has had a particularly outsized impact on the retail industry, especially as it relates to the precipitous growth of virtual try-on technology now being utilized by a wide variety of retail brands.
As brands embark on the widespread adoption of virtual try-on tech, states and cities from coast to coast—as well as lawmakers in Congress—are attempting to enact strict laws regulating the use of this and other forms of facial recognition. Facial biometrics has also become a priority focus for Federal Trade Commission (FTC) privacy enforcement actions. And virtual try-on technology that relies on facial recognition software is quickly emerging as the next major target of bet-the-company biometric privacy class action litigation.
To limit the expansive liability exposure that accompanies the use of facial recognition-powered virtual try-on features—while at the same time harnessing the myriad benefits that virtual try-on tech has to offer—brands are well advised to implement robust, flexible biometric privacy programs to minimize the risk of becoming the next target of a class action lawsuit or FTC enforcement action. Doing so can also aid brands in maintaining ongoing compliance with today’s expanding body of law governing facial biometrics.
Virtual Try-On Technology: Try Before You Buy… Anytime, Anywhere
The current era of digitalization has rapidly moved shopping online. This shift in consumers’ buying patterns and preferences has led to the proliferation of virtual try-on features, which are now being offered by a wide variety of retailers, including fashion, eyewear, and makeup brands, just to name a few.
As the name suggests, virtual try-on features, also known as “virtual mirrors,” allow shoppers to “try on” products using their camera-equipped devices, such as home computers, tablets, or mobile phones. This technology is powered by a combination of facial recognition software and augmented reality (AR). Unlike ordinary biometric facial recognition—which is used to identify an individual or verify his or her identity—AR facial recognition utilizes a combination of facial recognition and advanced face-tracking techniques to “understand” the human face. The result is a realistic rendering of products, with natural colors, textures, lighting, and physical form, that allows consumers to contextually visualize an item and get an authentic sense of how they would look in it before making a purchase, all without having to leave home.
The benefits to retailers that offer virtual try-on features are immense. This technology significantly enhances the customer experience, offering cutting-edge realism and a seamless try-before-you-buy experience. At the same time, brands are able to reap the benefits of enhanced engagement, increased brand awareness, higher revenue, greater conversion growth, and reduced returns.
Biometric Privacy Legal Landscape
Significantly, because virtual try-on features utilize facial recognition, brands that use this technology must comply with an increasingly complex web of biometric privacy laws and regulations.
First and foremost, brands must satisfy the requirements of biometric privacy statutes that directly regulate the collection and use of a range of different types of biometric data, including facial template data. Currently, only Illinois, Texas, and Washington have such laws on the books.
Overall, Illinois’ Biometric Information Privacy Act (BIPA) is far and away the most stringent of these laws. Under BIPA, a private entity cannot collect or possess facial template data without first providing notice, obtaining written consent, and making certain disclosures. BIPA also contains a private right of action provision that permits any “aggrieved” person under the law to recover statutory damages ranging between $1,000 and $5,000, which has generated a tremendous amount of class litigation from consumers alleging mere technical violations of Illinois’ biometric privacy statute.
Texas’ Capture or Use of Biometric Identifiers Act (CUBI) and Washington’s R.C.W. § 19.375.020 (also known as HB 1493) also apply to the use of virtual try-on tech, but present somewhat less of a liability risk because neither law contains a private right of action.
At the same time, several states without laws regulating biometrics are poised to enact their own biometric privacy statutes modeled after BIPA—which encompass facial recognition within their scope—in the near future. In 2021, New York and Maryland introduced “BIPA copycat” bills that closely mirror Illinois’ biometric privacy law, both of which remain pending at this time. If enacted, these laws would likely bring the tsunami of class action litigation generated by BIPA to New York and/or Maryland. And from a broader perspective, if successful these bills will likely provide further momentum for other states and cities to enact BIPA copycat laws of their own.
In addition, several other new biometric privacy laws have been added to the mix recently that directly target the use of facial recognition software. First, Portland, Oregon introduced an entirely new type of biometric privacy law with its enactment of an outright private-sector ban on the use of facial recognition, which went into effect at the start of 2021. Like BIPA, this facial recognition-focused law also contains a private right of action.
Second, New York City also enacted a new biometric privacy law of its own, which applies BIPA-like requirements to “commercial establishments.” While the NYC law applies to all types of biometrics, legislators clearly had facial recognition in mind as the primary focus of this regulation, even singling out the technology by name in the text of the law. Like its Portland counterpart, the NYC law also includes a private right of action provision.
While neither of these two new laws applies to the use of facial recognition-powered virtual try-on features, both are nonetheless directly relevant to brands that utilize virtual try-on tech for several reasons. First, these laws demonstrate that biometric privacy regulation is quickly expanding from the state level to the municipal level. Second, these laws show that legislators are particularly concerned with facial recognition and are focusing their efforts on enacting new regulation that directly and exclusively targets this specific type of biometrics. Third, these laws further illustrate the trend of legislators favoring private right of action provisions as the preferred enforcement mechanism for new legislation, which provides for much more expansive potential liability exposure as compared to those laws that provide only for administrative enforcement.
Last but not least, the FTC has also developed into a sizeable liability threat for brands that utilize virtual try-on features operating with the help of facial biometrics. Just recently, the FTC finalized the settlement of the agency’s first enforcement action specifically targeting the use of facial recognition technology.
More importantly, at the start of 2021 the FTC offered an unequivocal warning that policing facial recognition technology will remain a top priority for the agency for the foreseeable future. More recently, the FTC’s acting chair reiterated the agency’s new priority focus on facial recognition, promising to “redouble” the FTC’s efforts to identify violations in the area of facial recognition privacy and security. Taken together, in addition to having to ensure compliance with the ever-increasing patchwork of laws governing the use of facial biometrics, brands that utilize virtual try-on features now face greatly increased liability exposure from the FTC.
What This Means for Brands That Utilize Virtual Try-On Tech
Virtual try-on technology offers immense benefits and will continue to serve as a key tool for brands to drive revenue upward and enhance the level of customer satisfaction. With that said, as discussed above, virtual try-on technology also carries with it significant liability risks stemming from both the increasing number of biometric privacy laws and the FTC’s newfound interest in policing this technology.
Taken together, these developments mean that brands utilizing virtual try-on technology must ensure they have the appropriate biometric privacy compliance practices and protocols in place to avoid becoming the next target of a potentially game-changing BIPA class action lawsuit or FTC enforcement action. Fortunately, there are several key best practices that brands can implement to minimize these sizeable risks. Importantly, brands that take proactive steps to build out their biometric privacy compliance programs—especially those that may not yet be subject to any biometric privacy regulation—can also get a step ahead of the anticipated facial recognition laws that will inevitably be enacted as biometric privacy rights continue to expand across the country.
“Virtual Try-On Technology: Inside the Current & Anticipated Biometric Privacy Legal Landscape,” by David J. Oberly was published in Legaltech news on June 14, 2021.
Reprinted with permission from the June 14, 2021, edition of Legaltech news© 2021 ALM Media Properties, LLC. All rights reserved. Further duplication without permission is prohibited.