The BR Privacy & Security Download: January 2023
What We’re Watching in 2023
Happy New Year from the BR Data Privacy and Security Download! 2022 was a busy year for data privacy and security. State and federal regulatory agencies flexed their enforcement muscle, we counted down to new comprehensive state laws, and the U.S. Supreme Court overturned Roe v. Wade, which raised concerns about the use of personal data to enforce abortion restrictions. With 2023 promising to be just as busy, here are the top issues companies will be grappling with this year:
1. Building State Privacy Compliance Programs
The California Privacy Rights Act (“CPRA”) and Virginia Consumer Data Protection Act (“VCDPA”) came into force on January 1, 2023, and new comprehensive laws in Colorado, Connecticut, and Utah will follow later in 2023. Unfortunately for companies seeking to build or modify compliance programs, the final requirements of the CPRA and the Colorado Privacy Act (“CPA”), which takes effect July 1, 2023, remain in flux. The California Privacy Protection Agency is not expected to publish the first set of its final CPRA rules until late January or February 2023, and the Colorado Attorney General is likewise still finalizing the CPA rules.
Businesses subject to these laws will be spending 2023 building or modifying their privacy compliance programs to comply with numerous diverse requirements, including (1) providing appropriate notice at collection, (2) providing appropriate opt-out mechanisms for digital tracking, (3) responding to data rights requests, and (4) modifying service provider and customer contracts, among other requirements.
2. Increased Scrutiny on “Commercial Surveillance”
The Federal Trade Commission (“FTC”) published an Advance Notice of Proposed Rulemaking (“ANPR”) on Commercial Surveillance and Data Security, seeking to address practices relating to collecting, analyzing, and profiting from personal information collected about individuals. The FTC is not alone in seeking to address “commercial surveillance.” In 2022, the California Attorney General brought its first enforcement action under the California Consumer Privacy Act (“CCPA”) against Sephora, Inc., alleging unlawful “sales” of personal information through the sharing of data with third-party digital advertising and website analytics providers via cookies on Sephora’s website. Under both California’s and Colorado’s comprehensive privacy laws, businesses are required to respond to automated opt-out signals sent to websites by a consumer’s web browser, including the Global Privacy Control.
Companies will need to coordinate their privacy compliance and web and digital marketing functions to ensure they are keeping up to date with evolving legal, technical, and operational limitations on online tracking technologies.
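For illustration, honoring an automated opt-out signal such as the Global Privacy Control can start with checking the Sec-GPC request header defined by the GPC specification. This is a minimal sketch (the function name and the shape of the headers argument are hypothetical, and a real deployment must also propagate the opt-out to downstream advertising and analytics tags):

```python
def honors_gpc_opt_out(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control
    opt-out signal, i.e., the Sec-GPC header set to "1"."""
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"
```

Browsers that support GPC also expose the signal to client-side scripts as navigator.globalPrivacyControl, so the equivalent check can be made in the page itself.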
3. Trans-Atlantic Data Transfers
In the wake of the European Court of Justice Schrems II decision in 2020 that invalidated the EU-U.S. Privacy Shield Framework, the European Centre for Digital Rights non-profit group, known as NOYB, filed 101 complaints against EU controllers alleging the controllers’ use of common website analytics and digital advertising tools provided by U.S. companies transferred personal data to the U.S. in violation of the EU’s General Data Protection Regulation (“GDPR”). Data protection authorities, or relevant lead supervisory authorities, of Austria, France, Norway, Liechtenstein, and the Netherlands have found the use of these tools for purposes of collecting data about EU residents to be unlawful under the GDPR. The logic of the decisions calls into question whether any personal data could be transferred from the European Economic Area to the U.S. without end-to-end encryption.
Notwithstanding those decisions, the EU and U.S. continue to work on a Privacy Shield replacement. At the end of 2022, the European Commission published a draft adequacy decision for the EU-U.S. Data Privacy Framework. If ultimately approved, the framework may provide a new legal mechanism for the trans-Atlantic transfer of personal data. However, privacy campaigner Max Schrems has already vowed to challenge the new framework. Global companies will need to watch whether the new framework withstands legal challenge while continuing to monitor data protection authority decisions for potential impact on their international operations.
4. Regulatory Agencies Focus on Sensitive Health and Location Data
The U.S. Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization, which overturned the federal right to abortion, has led to significant concern that various types of personal data may be used by law enforcement agencies and private individuals to investigate or prosecute violations of state abortion laws. In fact, there have already been cases in which law enforcement sought geolocation data, electronic communications, and search or purchase history for this purpose. For example, in compliance with a search warrant, Facebook provided investigators with data related to chats, including images, audio and visual recordings, and message exchanges between accounts associated with a mother and daughter, as law enforcement investigated whether a fetus was miscarried or aborted.
In response, federal authorities and others have attempted to take steps to limit the sale and sharing of sensitive health and location data that could be used for these purposes. The FTC filed suit against data broker Kochava Inc. over alleged sales of geolocation data collected from devices traceable to millions of individuals, which can reveal the identities of individuals and their movements to and from sensitive locations, such as reproductive health clinics or religious places of worship. A substantially similar class action lawsuit was also brought against Kochava Inc. in California. President Biden issued an executive order directing the Department of Health and Human Services and the FTC to protect data in this context. Additional regulation of the use and sharing of health and/or location data may be forthcoming in 2023 as agencies seek to provide protections against secondary uses by law enforcement and others.
5. Shining the Light on Dark Patterns
In 2022, the FTC published its staff report, Bringing Dark Patterns to Light. The report details manipulative or deceptive online design practices used to trick or steer consumers into choices they would not otherwise have made, and discusses dark pattern practices across different online industries and contexts, such as children’s apps, subscriptions, and negative option marketing. At the end of 2022, the FTC settled allegations that Epic Games, the maker of the video game Fortnite, had used dark patterns to trick users into making unintentional purchases. As part of the settlement, Epic Games agreed to pay $245 million in refunds to affected consumers.
States have also taken note. Both the California and Colorado comprehensive state privacy laws contemplate dark patterns and state that consent cannot be lawfully obtained through the use of dark patterns. Companies should review the FTC report and monitor ongoing rulemaking efforts at the state level to ensure that customer and user workflows are designed to avoid the use of dark pattern techniques.
6. Don’t Forget the Basics (or Evolving Best Practices)
Both enforcement activity and guidance from regulators have served as a reminder of the need to build a compliance program that accounts for privacy and security basics, and have shed light on how agencies view best practices. The FTC settled a number of enforcement actions over failures to train staff to identify and respond to phishing attacks, to use multi-factor authentication (“MFA”), and to adhere to the basic privacy principle of data minimization.
At the same time, agencies provided guidance on best practices that companies should be using currently and seeking to deploy in the future. The Cybersecurity and Infrastructure Security Agency (“CISA”) continues to urge organizations to implement MFA. Additionally, CISA recommends certain phishing-resistant MFA forms over others that are more susceptible to cyber threats. As another best practice, CISA and other federal agencies, including the Office of Management and Budget, the Department of Defense, and the National Institute of Standards and Technology, continue to encourage organizations to move toward the adoption of a zero-trust architecture and a “never trust, always verify” mindset to reduce cybersecurity risks in an increasingly mobile environment.
In a world of constantly evolving cyber threats and security requirements, companies should ensure their data security programs include basics such as data minimization and MFA, and should stay on top of regulatory authority guidance on best practices.
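As a minimal sketch of one common MFA building block, the code below computes a time-based one-time password per RFC 6238 using only the Python standard library. This illustrates the mechanism only; it is not CISA’s recommended phishing-resistant MFA, which favors forms such as FIDO2 hardware keys:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, timestep: int = 30, at=None) -> str:
    """Compute an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // timestep)
    msg = struct.pack(">Q", counter)           # 8-byte big-endian time counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                 # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Verification compares the user-supplied code against totp() for the current timestep (and usually one adjacent step to tolerate clock skew), using a constant-time comparison such as hmac.compare_digest.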
STATE & LOCAL LAWS & REGULATIONS
CPPA Provides CPRA Rulemaking Update
The California Privacy Protection Agency (“CPPA”) Board announced during its December 16, 2022, public meeting that it expects to publish a final proposed set of its current draft regulations implementing the California Privacy Rights Act of 2020 (“CPRA”) in late January or early February 2023. The CPRA came into effect on January 1, 2023. The regulations proposed to date do not represent the complete set of rules to be promulgated by the CPPA, which must still prepare draft rules addressing risk assessments, cybersecurity audits, and automated decision-making, among other topics. The New Rules Subcommittee proposed additional preliminary rulemaking to seek public input on those topics in early 2023. The CPPA Board also discussed delegating to its Executive Director the Board’s two membership appointments to the California Children’s Data Protection Working Group (“Working Group”). The Working Group is tasked with recommending best practices regarding children’s online access under California’s Age-Appropriate Design Code Act, which is currently facing a legal challenge to its validity.
Tech Group Seeks to Invalidate California’s Age-Appropriate Design Code Act
NetChoice, LLC (“NetChoice”), an umbrella group of tech companies that includes Google, Twitter, and TikTok, filed a lawsuit seeking to invalidate the California Age-Appropriate Design Code Act (the “Act”). The Act is intended to provide heightened privacy protections for children under 18. The group alleges the Act is unconstitutional under the U.S. Constitution and the California state constitution and preempted by federal law. The complaint argues the Act pressures companies to act as censors of speech, in violation of the First Amendment, by requiring companies to prevent or mitigate content (e.g., speech) subjectively considered “harmful” or that “could harm children,” or face civil penalties. Further, the Act requires age verification for users and documentation to track which users are minors, which NetChoice argues would result in substantial risks that are unnecessary given existing federal protections. NetChoice seeks declaratory relief stating the Act is invalid and injunctive relief to enjoin its enforcement.
Enforcement of New York City Anti-AI Bias Law Postponed
The New York City Department of Consumer and Worker Protection (“DCWP”) announced that it would delay the date of its enforcement of the Automated Employment Decision Tools Law (“AEDTL”) from January 1, 2023, to April 15, 2023, due to the high volume of public comments received in connection with the law’s proposed regulations. The DCWP also plans to hold a second public hearing before finalizing the AEDTL’s implementing regulations. The AEDTL prohibits employers from using an automated employment decision tool to screen candidates or evaluate an employee for a promotion, unless the tool has been subject to a bias audit that assesses the tool’s disparate impact on individuals of any race, ethnicity, or sex category required to be reported by employers to the U.S. Equal Employment Opportunity Commission (“EEOC”) in an EEO Component 1 report. The AEDTL also requires notice to be given to the candidate or employee regarding the use of the tool.
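To make the audit concept concrete, the core metric of such a bias audit is a selection-rate impact ratio: each category’s selection rate divided by the highest category’s rate. The sketch below is illustrative only (the function name and input shape are hypothetical, and the DCWP’s final rules specify the required categories and methodology):

```python
def impact_ratios(outcomes: dict) -> dict:
    """outcomes maps a category (e.g., a race/ethnicity or sex category)
    to a (number selected, number of candidates) tuple. Returns each
    category's selection rate divided by the highest category's rate."""
    rates = {cat: sel / total for cat, (sel, total) in outcomes.items() if total}
    if not rates:
        return {}
    top = max(rates.values())
    return {cat: rate / top for cat, rate in rates.items()}
```

A ratio well below 1.0 for a category (the EEOC’s traditional rule of thumb flags ratios under 0.8) signals potential disparate impact warranting scrutiny.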
Colorado Publishes Revised Colorado Privacy Act Rules
The Colorado Attorney General’s office published revised rules for the Colorado Privacy Act. The revisions incorporate a number of notable changes. Among them, the revised rules no longer require privacy notices to be organized around the purpose of processing personal data; the initial draft had required a detailed description of each processing purpose, accompanied by information on the personal data processed for that purpose and how personal data is sold or shared for that purpose. The revised rules also narrow the initial draft’s requirement to refresh consent for the processing of sensitive personal data: rather than refreshing consent annually, controllers must now refresh consent only where a consumer has not interacted with the controller in the prior 12 months. The revised rules make a number of other clarifications relating to dark patterns, data protection rights, and responding to data subject rights requests. The Colorado Privacy Act takes effect July 1, 2023.
FEDERAL LAWS & REGULATIONS
FTC Provides Updated Interactive Tool for Health App Developers
The FTC released an updated Mobile Health App Interactive Tool to help industry members determine what federal laws and regulations might apply to their apps, such as the FTC’s Health Breach Notification Rule, the Children’s Online Privacy Protection Act (“COPPA”), the Health Insurance Portability and Accountability Act (“HIPAA”), the 21st Century Cures Act, the Office of the National Coordinator for Health Information Technology (“ONC”) Information Blocking Regulations, and the Federal Food, Drug, and Cosmetic Act. The tool asks high-level questions about the nature of the app, how it functions, the data it collects, and the services it provides to users. First released in 2016, the tool was created in conjunction with the U.S. Department of Health and Human Services’ (“HHS”) Office of Civil Rights (“OCR”), the ONC, and the U.S. Food and Drug Administration (“FDA”).
U.S. Senators Push for Action on Children’s Privacy and Proposed Kids Online Safety Act Revised
U.S. Senators Ed Markey (D-MA), Bill Cassidy (R-LA), Cynthia Lummis (R-WY), and Richard Blumenthal (D-CT) wrote to Senate and House leadership calling for action on children’s privacy, specifically urging the passage of legislation that, at a minimum, contains the following key privacy provisions: (1) banning targeted advertising to children; (2) extending data protections to teenagers over 13; (3) instituting a federal Youth Marketing and Privacy Division at the FTC; and (4) commissioning a study of the COPPA safe harbor provisions. Relatedly, Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) introduced an updated version of the Kids Online Safety Act. The updates modify the duty-of-care language to require covered companies to design their platforms with children’s best interests in mind, address concerns that anti-LGBTQ state attorneys general could misuse the law, and clarify that companies are not required to collect additional data from users to ascertain their age.
Illinois BIPA Requires Biometric Data Policies Prior to Collection
The Illinois Second District Appellate Court in Mora v. J&M Plating, Inc. reversed the trial court’s grant of summary judgment to the defendant, J&M Plating, rejecting the trial court’s finding that section 15(a) of Illinois’ Biometric Information Privacy Act (“BIPA”) establishes no time limit by which a private entity must establish a written, publicly available biometric data retention and destruction policy. The appellate court held that a retention and destruction schedule must be established “prior to” possession of biometric data or “at the moment of possession or within a reasonable time thereafter.” J&M Plating allegedly began collecting biometric fingerprint scans from the lead plaintiff of the class action, Mora, for employee time-entry purposes in 2014 but did not establish a retention and destruction schedule until 2018, nearly four years after its statutory duty was triggered.
FTC Enters into Settlements with Epic Games Over COPPA Violations and Dark Patterns
The FTC announced that it settled two complaints against Epic Games, Inc. (“Epic Games”), the creator of the video game Fortnite, alleging that Epic Games violated COPPA and used “dark patterns” to trick users into making unintentional purchases. As part of the settlements, Epic Games will pay a $275 million penalty for violating the COPPA Rule and will pay $245 million to refund consumers harmed by its dark patterns and billing practices. The FTC complaint alleged that Epic Games violated the COPPA Rule by collecting personal information from children under the age of 13 without notifying parents or obtaining parental consent. The FTC also alleged that Fortnite settings enabling text and voice communications among users by default harmed children by exposing them to bullying, harassment, and psychologically traumatizing issues. The $275 million penalty is the largest monetary penalty ever imposed for a violation of an FTC rule. In a separate administrative complaint, the FTC alleged that Epic Games used dark patterns to trick players into making unwanted purchases and let children rack up unauthorized charges without parental involvement, and that Epic Games ignored more than one million user complaints and employee concerns that a “huge” number of users were being wrongfully charged. The $245 million refund amount is the largest ever secured through an FTC administrative order.
Indiana Brings Lawsuit Against TikTok
Indiana filed a lawsuit against TikTok Inc., maker of the popular social media app, and its Chinese parent company, ByteDance Ltd., alleging that TikTok misrepresents the app’s safety to children while its algorithms “force-feed” and promote sexual content, intense profanity, or drug references to users, irrespective of age, and influence offline behavior. Separately, Indiana alleges that TikTok makes false, deceptive, and misleading statements about the risks to consumers’ personal and sensitive personal data posed by TikTok’s data security practices, particularly its data sharing practices subject to Chinese laws that mandate secret cooperation with China’s intelligence activities. TikTok is currently banned from government-issued devices in nineteen states and, under the No TikTok on Government Devices Act, among executive agencies.
OCR Settles with Dental Practice for Disclosures of Patients’ Protected Health Information
OCR announced that it settled with B. Brandon Au, DDS, Inc., d/b/a New Vision Dental (“New Vision Dental”) over the impermissible disclosure of patient protected health information (“PHI”) in response to online reviews and other potential violations of the HIPAA Privacy Rule. In November 2017, OCR received a complaint alleging that New Vision Dental impermissibly disclosed PHI, including patient names, treatment information, and insurance information, in response to patients’ online reviews of the practice. OCR’s investigation found potential violations of the HIPAA Privacy Rule, including impermissible uses and disclosures of PHI and failures to provide an adequate Notice of Privacy Practices and to implement privacy policies and procedures. New Vision Dental agreed to pay $23,000 to OCR and to implement a corrective action plan that OCR will monitor for two years to ensure compliance with the HIPAA Privacy Rule.
INTERNATIONAL LAWS & REGULATIONS
U.S.-EU Trade and Technology Council Releases AI Roadmap
The U.S.-EU Trade and Technology Council issued a Joint Artificial Intelligence (“AI”) Roadmap (“AI Roadmap”) intended to inform U.S. and EU approaches to risk management and trustworthy AI. The AI Roadmap endorses a risk-based approach and focuses on trustworthy AI systems. The AI Roadmap suggests steps to align U.S. and EU approaches by: (1) using shared terminologies and taxonomies; (2) promoting leadership and cooperation in international technical standards development activities and analysis and collection of tools for trustworthy AI and risk management; and (3) monitoring and measuring existing and emerging AI risks.
EU Publishes Draft Adequacy Decision for U.S.-EU Data Privacy Framework
The European Commission (“Commission”) published its draft adequacy decision for the new U.S.-EU Data Privacy Framework (“Framework”). The Framework is intended to provide a new legal basis to allow cross-border data transfers between the EU and the U.S. following the invalidation of the EU-U.S. Privacy Shield by the Court of Justice of the European Union in July 2020. The draft adequacy decision will now be presented to the European Data Protection Board, which will provide an opinion on whether the Framework is sufficient to ensure an adequate level of protection for EU personal data. If the European Data Protection Board approves the decision, it will be forwarded to a committee of EU member states and then the European Parliament for approval. Since the invalidation of the Privacy Shield, U.S. companies have primarily relied on the Commission’s standard contractual clauses as a legal mechanism for the transfer of EU personal data to the U.S.
© 2023 Blank Rome LLP. All rights reserved. Please contact Blank Rome for permission to reprint. Notice: The purpose of this update is to identify select developments that may be of interest to readers. The information contained herein is abridged and summarized from various sources, the accuracy and completeness of which cannot be assured. This update should not be construed as legal advice or opinion, and is not a substitute for the advice of counsel.