The BR Privacy & Security Download: February 2023

Welcome to this month's issue of The BR Privacy & Security Download, the digital newsletter of Blank Rome’s Privacy, Security & Data Protection practice. We invite you to share this resource with your colleagues and visit Blank Rome’s Privacy, Security & Data Protection webpage for more information about our team.


Flurry of Comprehensive Privacy Legislation Introduced as State Legislative Sessions Open
Comprehensive privacy legislation was introduced in at least 11 states to start 2023. The legislative proposals vary significantly in both requirements and enforcement strategies. Two competing bills were introduced in New York, one requiring consent for processing sensitive data and the other requiring consent for processing all personal data as well as a signed release for processing biometric data. Kentucky’s, Tennessee’s, Iowa’s, and New Hampshire’s proposals, as well as both Indiana’s HB 1554 and SB 5, provide for exclusive enforcement by the state Attorney General. New York’s, Oregon’s, and Washington’s legislative proposals, as well as all three competing proposals in the Massachusetts legislature, provide consumers with a private right of action for violations. Hawaii introduced two competing bills, one with a private right of action and one that would be exclusively enforced by the Attorney General. Mississippi’s bill provides a limited private right of action in the event of a data breach, similar to the California Privacy Rights Act (“CPRA”). Legislative proposals in Massachusetts and Washington provide for GDPR-like penalty calculations: one of Massachusetts’ bills provides for administrative fines of up to the greater of $10 million or 2 percent of annual global revenue, and Washington’s bill provides for civil penalties of up to the greater of $25,000 per violation or 4 percent of annual revenue. Faced with the potentially increased complexity of complying with several additional state laws, businesses will likely continue to lobby Congress to take action on federal legislation.

Biometric Privacy Laws Introduced in Three States
Biometric privacy legislation was re-introduced in Maryland, which failed to pass a biometric privacy law in 2022. Maryland’s proposed law would prohibit processing of biometric data without consent except in limited circumstances, require development of a public biometric data retention and destruction policy, and allow individuals to bring a private right of action for violations. If passed as currently written, Maryland’s bill would be effective on October 1, 2023. New York and Mississippi also introduced biometric privacy laws. Similar to Maryland’s proposed law and Illinois’ Biometric Information Privacy Act (“BIPA”), New York’s and Mississippi’s proposals would require businesses to obtain consent or written releases to process biometric data, make data retention policies public, and allow individuals to bring private rights of action for violations. Mississippi’s proposed legislation, if passed, would be effective on July 1, 2023, and New York’s proposed legislation would take effect 90 days after it becomes law. BIPA has fueled significant litigation in recent years, and the expansion of private rights of action for violations of similar laws in other states would drive similar lawsuits against companies in those states. Companies that process biometric data and operate in these states should closely monitor how the legislation progresses and evaluate their compliance posture.

California Privacy Protection Agency to Hold Public Forum on Regulations to California Privacy Rights Act
The California Privacy Protection Agency (“CPPA”), the agency tasked with enforcing the California Privacy Rights Act (“CPRA”), announced that it will hold a public forum on February 3, 2023. The public forum will cover discussion and possible action regarding the current proposed regulations implementing the CPRA. The CPPA will also discuss preliminary rulemaking activities for new rules on risk assessments, cybersecurity audits, and automated decision-making, topics on which the CPRA grants the CPPA authority to draft regulations. Previously, during the December public forum, CPPA Executive Director Ashkan Soltani stated that the final regulations would likely be released in late January. Under that timeline, with a 30-day review by the California Office of Administrative Law, the regulations would take effect around April.

Third Version of Colorado Privacy Act Proposed Rules Published
The Colorado Attorney General released updated proposed draft rules to the Colorado Privacy Act (“CPA”), amending the previous version and incorporating comments received through January 18, 2023. The updates include new or updated defined terms, clarity on controller obligations for consumer opt-out requests, including a new “clear and conspicuous” notice requirement involving profiling related to certain decision-making categories, and modifications to controller obligations for consumer requests to correct inaccuracies. A formal CPA rulemaking hearing was held on February 2, 2023.

Virginia Introduces Amendment to Consumer Data Protection Act
The Virginia Legislature has introduced an amendment to the Virginia Consumer Data Protection Act (“VCDPA”) to strengthen privacy protections for children. The bills (HB 1688/SB 1026) amend the definition of “child” to include all individuals under the age of 18, and require operators (a newly defined term) to obtain verifiable parental consent before registering any child with the operator’s product or service or before collecting, using, or disclosing such child’s personal data. The amendments also prohibit a controller from knowingly processing the personal data of a child for purposes of (i) targeted advertising, (ii) selling the personal data, or (iii) profiling in furtherance of decisions that produce legal or similarly significant effects concerning the child. The bill further clarifies that the VCDPA applies to “operators” rather than “persons.” An operator is defined as “a natural or legal entity that conducts business or produces products or services that are targeted to consumers and that collects or maintains personal data from or about such consumers.”

My Health, My Data Act Introduced in Washington State
Washington State introduced HB 1155, the My Health, My Data Act (the “Act”). The Act, which was introduced in part at the request of Washington Attorney General Bob Ferguson, is intended to provide Washingtonians with more control over their health data in the wake of the Dobbs Supreme Court decision. Under existing law, patients’ health data is typically protected by the federal Health Insurance Portability and Accountability Act (“HIPAA”), which applies to personal health information collected and maintained by covered entities (i.e., healthcare providers and payers). The health data protections proposed by the Act include (i) additional notice, disclosure, and consumer consent requirements; (ii) a new consumer right to delete; (iii) a prohibition on the sale of health data; and (iv) a prohibition on the use of virtual perimeter “geofences” around facilities that provide health care services. The Act would be enforceable under Washington’s Consumer Protection Act against non-HIPAA-covered entities (e.g., health apps, websites, and certain medical facilities) that process Washington consumers’ health data.

West Virginia Introduces Children’s Privacy Bill for Persons under 18
West Virginia introduced House Bill 2460 (“HB 2460”), which would complement and expand existing federal children’s privacy protections under the Children’s Online Privacy Protection Act (“COPPA”) to include persons under the age of 18, rather than the current standard of under the age of 13. HB 2460 would require the state’s Attorney General to propose rules for the protection of children’s online privacy by March 1, 2023, including those involving data collection and use notifications, verifiable consent requirements, and non-discrimination provisions. Violations of HB 2460 would be enforceable by the Attorney General through injunctive relief and/or civil penalties of up to $5,000 per violation.

Oregon and Connecticut Introduce Age Appropriate Design Code Laws
Oregon and Connecticut are following in the footsteps of California by introducing their own age-appropriate design code laws. Oregon introduced SB 196, which requires businesses that provide online products, services, or features that a child is reasonably likely to access to identify, evaluate, and mitigate risks to the child from the online product, service, or feature and to complete a data protection impact assessment. Like the California law, the Oregon bill also requires default privacy settings that offer the maximum level of privacy as well as clearly and concisely worded privacy policies and terms of service. The bill further establishes a task force to study how children access, use, and are affected by online products, services, and features, and methods for mitigating those risks. Connecticut introduced HB 6253, which proposes that Connecticut’s general statutes be amended to establish an age-appropriate design code. Both the Oregon and Connecticut bills define a child as an individual under the age of 18.


NIST Announces Intent to Revise Cybersecurity Framework
The National Institute of Standards and Technology (“NIST”) has published a concept paper to seek additional input on the structure and direction of its Cybersecurity Framework (“CSF”) to develop a CSF 2.0. The concept paper outlines the potential changes that NIST is considering, including increasing international collaboration and engagement, adding implementation examples for CSF subcategories, developing a CSF profile template, adding a new “govern” function, expanding coverage of the supply chain, and updating its Performance Measurement Guide for Information Security (SP 800-55r2). NIST has stated that the CSF 2.0 will retain the current level of detail and remain technology- and vendor-neutral but reflect changes in cybersecurity practices. Comments on the concept paper are due by March 3, 2023. The changes proposed in the concept paper will also be discussed at the second CSF 2.0 virtual workshop on February 15, 2023, and during CSF 2.0 in-person working sessions on February 22-23, 2023.

FDA Given New Authority to Establish Medical Device Security Requirements
In December 2022, President Biden signed a $1.7 trillion omnibus appropriations bill into law, which included measures that give the U.S. Food and Drug Administration (“FDA”) new authority to establish medical device security requirements for manufacturers. More specifically, the law provides the FDA with $5 million and the authority to ensure all new medical devices brought to market are designed with security in mind. This means the FDA may pursue requirements for all medical device submissions to include a software bill of materials and adequate evidence to demonstrate the product can be updated and patched.

FCC Publishes Notice of Proposed Rulemaking for New Breach Reporting Requirements
The Federal Communications Commission (“FCC”) proposed updated data breach reporting rules for telecommunications service providers and published a Notice of Proposed Rulemaking (“Notice”). The proposed rules seek to strengthen current rules for notifying customers and federal law enforcement of breaches implicating customer proprietary network information (“CPNI”). The proposed rules expand the definition of “breach” to include inadvertent disclosures of CPNI and provide minimum requirements for breach notifications sent to customers. The rules also propose new governmental notice obligations that would require providers to notify the FCC, in addition to federal law enforcement, “as soon as practicable.” The current “within 7 business days of discovery” deadline to notify federal law enforcement and consumers in the event of a breach would be removed. Comments on the Notice are due on February 22, 2023, and reply comments are due on March 24, 2023.

NIST Releases AI Risk Management Framework
The National Institute of Standards and Technology (“NIST”) released its Artificial Intelligence Risk Management Framework (“AI RMF 1.0” or the “Framework”). The Framework is intended to be a living document for voluntary use by organizations to guide the design, development, deployment, or use of AI systems. NIST also published supporting resources, including a draft companion Playbook and Roadmap. The two-part Framework describes, first, how organizations can frame AI-related risks and the characteristics of trustworthy AI systems; and second, the actual framework and its four core functions—govern, map, measure, and manage. The current draft Playbook includes suggested actions, references, and documentation guidance based on best practices and research insights. NIST is accepting feedback on the draft Playbook through February 27, 2023, and a revised version will be posted in the NIST Trustworthy and Responsible AI Resource Center in Spring 2023.


BIPA Claims against Hosting Infrastructure Vendor Partially Dismissed
The U.S. District Court for the Northern District of Illinois dismissed two claims in the putative class action Jones v. Microsoft Corp., which alleged that Microsoft Corp. (“Microsoft”) failed to provide notice to or obtain consent from plaintiffs, as required by Illinois’ Biometric Information Privacy Act (“BIPA”), and improperly disclosed plaintiffs’ collected data. The claims arise from Paychex’s use of Microsoft’s Azure platform to store fingerprint scans collected from Chicago Marriott Suites (“Marriott”) employees, on Marriott’s behalf, for employee-timekeeping purposes. Paychex contracted with Microsoft to utilize Azure’s cloud storage. Neither Marriott nor Paychex is a party to the suit. In the dismissal order, the judge determined that plaintiffs failed to establish that Microsoft took an “active step” to obtain or receive plaintiffs’ biometric data, had a direct relationship with plaintiffs that would permit it to obtain consent, or disseminated the data.

Court Weighs in on What Constitutes Possession of Biometric Data
The Illinois appellate court in Barnett et al. v. Apple, Inc. affirmed the trial court’s dismissal of the putative class action brought against Apple, Inc. (“Apple”) under Illinois’ Biometric Information Privacy Act (“BIPA”), for Apple’s failure to provide consumers a written biometric data retention-and-destruction policy before scanning and “possessing” users’ biometric information through Apple’s Face ID and Touch ID software (collectively, “ID Software”). Although BIPA does not define “possession,” the appellate court determined that consumers’ use of Apple’s ID Software did not automatically constitute Apple’s “possession” of biometric data under BIPA. Rather, Apple’s ID Software functioned as elective tools that allowed users to capture and store their own biometric information on their own devices. The panel of appellate judges noted that the facts alleged by the Barnett plaintiffs seemed to suggest that “Apple designed [the ID Software] features almost with the express purpose of handing control to the user.”


HHS Settles with Life Hope Labs over HIPAA Right of Access Violation
The U.S. Department of Health and Human Services (“HHS”) Office for Civil Rights (“OCR”) announced a settlement with Life Hope Labs, LLC (“Life Hope Labs”) concerning a potential violation of the Health Insurance Portability and Accountability Act (“HIPAA”) Privacy Rule’s right of access provision. The complaint filed with OCR alleged that Life Hope Labs provided a personal representative with a copy of her deceased father’s medical records seven months after she had requested them, violating the HIPAA Privacy Rule’s requirement to provide access within thirty calendar days. Life Hope Labs agreed to pay $16,500 and to implement a corrective action plan that includes two years of monitoring by OCR.

California AG Announces CCPA Enforcement Sweep Focusing on Mobile Apps
California Attorney General Rob Bonta announced an investigative sweep focusing on mobile apps that fail to comply with the California Consumer Privacy Act (“CCPA”). The enforcement sweep focused on mobile apps in the retail, travel, and food service industries that allegedly fail to comply with consumer opt-out requests or do not offer any mechanism for consumers who want to stop the sale of their data. The Attorney General’s enforcement sweep also focused on businesses that failed to process consumer requests submitted by an authorized agent, including requests submitted by Permission Slip, a mobile application developed by Consumer Reports that allows consumers to send requests to opt out of the sale of their personal information and to delete it.

DC Attorney General Settles Dark Pattern Case Involving Alleged Deceptive Geolocation Tracking
Washington, D.C., Attorney General (“AG”) Karl Racine announced Google’s agreement to pay a $9.5 million penalty to settle claims that Google manipulated consumers to falsely believe they could control what data Google collected, stored, or used about them, as well as Google’s use of “dark patterns” to undermine consumers’ informed choices. For example, the AG alleged that Google’s support page states that users can turn off location history to limit Google’s access to users’ geolocation data, but in reality, some Google apps continued to automatically track and store this data unless users turned off Google’s “Web and App Activity” setting, which does not specifically reference location data. Additionally, the AG alleged that users may only delete Google’s location markers through a painstaking manual process. Under the settlement agreement, Google must also change how it informs users about its data collection, storage, and use practices involving users’ geolocation data.


Irish DPC Adopts Final Decision on Meta’s Legal Basis for Serving Personalized Ads
Ireland’s Data Protection Commission (“DPC”) announced final decisions in two cases involving Meta’s Facebook and Instagram that find Meta’s processing of personal information to serve personalized ads cannot rely on performance of a contract as a legal basis. The DPC also found that Meta had breached its obligations in relation to transparency of processing because information regarding the legal basis used by Meta to process personal data was not clear. As part of the decision, the DPC fined Meta a total of €390 million. The DPC has given Meta three months to bring its data processing operations into compliance with the EU’s General Data Protection Regulation (“GDPR”). The investigations resulted from complaints filed by privacy rights group NOYB on May 25, 2018, the day the GDPR took effect.

CNIL Announces Fines Relating to Inability to Refuse Cookies as Easily as Accepting Them
The French Data Protection Authority (“CNIL”) announced it had imposed penalties in two unrelated cases involving companies’ failure to provide individuals with mechanisms to refuse cookies that were as easy to use as the mechanisms to accept them. In the first case, the CNIL fined Microsoft Ireland Operations Limited €60 million for depositing cookies, including for advertising purposes, without consent, and for failing to provide a mechanism to refuse cookies that was as easy to use as the mechanism offered to accept them. Separately, the CNIL fined TikTok’s European affiliates €5 million for making the mechanism to refuse cookies more complex than the mechanism to accept them.

ICO Publishes List of Reported Breaches
The United Kingdom Information Commissioner’s Office (“ICO”) published for the first time a list of entities that had self-reported personal data breaches to the ICO, as well as a list of civil investigations undertaken by the ICO and data protection complaints the ICO had received. The data covered the period from the fourth calendar quarter of 2020 through the first calendar quarter of 2022 and included the reporting organization’s name and sector, the issues involved, and the outcome of any action taken by the ICO. The self-reported breach list did not include cases that were referred to the ICO’s investigations department for possible regulatory action. Cases that were referred to investigations appear on separately maintained lists that were also made publicly available by the ICO.

Irish DPC Fines WhatsApp €5.5 Million
The DPC completed its investigation into WhatsApp Ireland (“WhatsApp”) and fined the company €5.5 million for the alleged breach of its obligations under Articles 12 and 13(1)(c) of the EU’s General Data Protection Regulation (“GDPR”) in relation to transparency. More specifically, the DPC found that WhatsApp forced users into consenting to the processing of their data in its Terms of Service and did not clearly outline to users what processing operations were being carried out on their personal data, for what purpose, and by reference to which of the six legal bases identified in Article 6 of the GDPR. WhatsApp has six months to bring its data processing practices into compliance with the GDPR.


© 2023 Blank Rome LLP. All rights reserved. Please contact Blank Rome for permission to reprint. Notice: The purpose of this update is to identify select developments that may be of interest to readers. The information contained herein is abridged and summarized from various sources, the accuracy and completeness of which cannot be assured. This update should not be construed as legal advice or opinion, and is not a substitute for the advice of counsel.