The BR Privacy & Security Download: May 2023
Welcome to this month's issue of The BR Privacy & Security Download, the digital newsletter of Blank Rome’s Privacy, Security & Data Protection practice. We invite you to share this resource with your colleagues and visit Blank Rome’s Privacy, Security & Data Protection webpage for more information about our team.
STATE & LOCAL LAWS & REGULATIONS
Washington State Passes Health Data Privacy Law
Washington State enacted H.B. 1155, the My Health, My Data Act (the “Act”), to increase privacy protections for health data collected from Washington consumers and residents. The Act broadly defines protected “consumer health data” to mean personal information about a consumer’s physical or mental health status, such as reproductive and gender-affirming care information, biometric data, and precise geo-location data linked to health services or supplies. The Act imposes disclosure and opt-in consent requirements for the collection, sale, or sharing of consumer health data by “regulated entities,” effective March 31, 2024, and “small businesses,” effective June 30, 2024. The Act also restricts geo-fencing around healthcare facilities and grants consumers rights to access, delete, and restrict the sale or sharing of their health data, as well as a private right of action.
California Chamber of Commerce Requests Delay for Enforcement of California Privacy Rights Act
The California Chamber of Commerce has filed a petition in California state court for a delay in the enforcement of the California Privacy Rights Act (“CPRA”) regulations. The California Privacy Protection Agency (“CPPA”), the agency with rulemaking and enforcement authority for the CPRA, was required to adopt final regulations to the CPRA by July 1, 2022, one year before the scheduled enforcement date. However, the CPPA adopted partial final regulations on March 30, 2023, leaving businesses with only around three months to come into compliance. The Chamber of Commerce argued that the CPPA’s “piecemeal approach and disregard of statutory deadlines” have not given businesses sufficient time to prepare for the CPRA regulations.
Indiana Passes Consumer Data Privacy Act
Indiana became the seventh state to enact a comprehensive data privacy law with the passage of S.B. 5, the Indiana Consumer Data Privacy Act (“ICDPA”). The ICDPA closely mirrors the Virginia Consumer Data Protection Act and largely follows existing state privacy laws in its scope and applicability, data controller obligations and requirements, and consumer protections, including the consumer rights to know, access in a portable format, correct, and delete their personal data, and to opt out of its sale and certain processing. However, under the ICDPA, the right to correct is limited to personal data previously provided by the consumer, and the right to access may also be satisfied by providing the consumer with a “representative summary” of their personal data. Violations of the ICDPA will be enforced by the Indiana Attorney General. Indiana has provided a longer period for businesses to prepare for the ICDPA than other states that have passed comprehensive privacy laws. The ICDPA will go into effect on January 1, 2026.
Montana Passes Comprehensive Privacy Law
The Montana Legislature has passed the Consumer Data Privacy Act (S.B. 384) (“MCDPA”), adding to the growing patchwork of state comprehensive privacy laws. The MCDPA protects consumers, defined as Montana residents, and like the laws of Colorado, Connecticut, Virginia, and Utah, specifically excludes residents acting in a commercial or employment context. Similar to prior state comprehensive privacy laws, the MCDPA obligates data controllers to provide a privacy notice, obtain consent before processing sensitive data, enter into written contracts containing specific clauses with processors, and conduct data protection assessments. The MCDPA also provides consumers the rights to access, data portability, correct, delete, and opt out of targeted advertising, sale of personal data, and profiling. Similar to Colorado and California, beginning January 1, 2025, controllers must allow a consumer to opt out of targeted advertising and the sale of personal data through an opt-out preference signal. The MCDPA gives exclusive enforcement authority to the Montana Attorney General with a 60-day cure period that sunsets on April 1, 2026. If signed by the Montana Governor, the MCDPA will take effect on October 1, 2024.
Tennessee Passes Information Protection Act
Tennessee lawmakers passed and sent to Governor Bill Lee for signature the Tennessee Information Protection Act (“TIPA”), which will go into effect on July 1, 2025. The TIPA largely follows provisions found under existing state data privacy laws, aligning most closely with Virginia’s law, but with a narrower scope of applicability. However, the TIPA uniquely deviates from existing state privacy laws by requiring covered entities to adhere to the National Institute of Standards and Technology (“NIST”) privacy framework, and compliance with the framework may serve as an affirmative defense against enforcement actions brought by the Tennessee Attorney General for violations of the TIPA. The TIPA also provides a 60-day cure period with no sunset date.
New York Attorney General Releases Data Security Guidance
The New York Attorney General released a guide to help businesses adopt effective data security measures to better protect New York residents’ personal information (“Data Security Guide”). The Data Security Guide discusses some data security failures found in recent data security investigations and recommends practices businesses should adopt to better secure their systems, fortify their networks, and strengthen their data security measures. The Data Security Guide recommends that businesses: (i) maintain controls for secure authentication (e.g., multi-factor authentication and complex and unique passwords); (ii) encrypt sensitive customer information like social security numbers; (iii) ensure vendors use appropriate data security measures to safeguard the information they have access to; (iv) maintain an asset inventory that tracks where customer information is stored; (v) guard against credential stuffing; and (vi) notify customers in a timely and accurate way in the event of a data breach.
New York City Department of Consumer and Worker Protection Adopts Final Rules on Automated Employment Decision Tools
The New York City Department of Consumer and Worker Protection (“DCWP”) adopted final rules to implement New York City’s Local Law 144 regarding automated employment decision tools (“AEDTs”). The final rules were adopted with changes based on the public comments received by the DCWP. The changes include: (i) expanding the definition of “machine learning, statistical modeling, data analytics, or artificial intelligence” to broaden its scope; (ii) adding a requirement that the published summary of results of a bias audit include the number of individuals the AEDT assessed who fall within an unknown category and are therefore excluded from the calculations; and (iii) clarifying when an employer or employment agency may rely on a bias audit conducted using the historical data of other employers or employment agencies.
Arkansas Passes Children’s Online Privacy Laws
Arkansas enacted S.B. 396, the Social Media Safety Act, and S.B. 66 (enacted as Act 612), the Protection of Minors from the Distribution of Harmful Material Act (collectively, the “Acts”), to regulate social media usage by Arkansas minors. Both Acts follow similar legislation recently passed in Utah, requiring companies that control social media platforms used by Arkansas consumers to engage in “reasonable age verification” of their users and account holders. S.B. 396 specifically requires this process to be done through third-party vendors and prohibits Arkansas minors under the age of 18 from creating an account on a platform without express parental consent. S.B. 66 further establishes liability for the publication or distribution of pornographic material harmful to minors. Notably, S.B. 396 carves out an exemption for companies that offer “enterprise collaboration tools” for K-12 schools and derive less than 25 percent of their revenue from operating any social media platform. However, it is unclear how broadly this exemption will be interpreted upon S.B. 396’s effective date of September 1, 2023.
Montana Passes Statewide Ban on TikTok
Montana has become the first U.S. state to pass legislation (S.B. 419) banning TikTok on personal devices. The bill would prohibit TikTok from operating in Montana and prohibit app stores from offering TikTok for download within the state. Failure to comply with the bill would result in a fine of $10,000 per violation. The only affirmative defense offered is that the violating entity could not reasonably have known that the violation occurred within the state of Montana. If the Montana Governor signs the bill, the ban would go into effect on January 1, 2024.
FEDERAL LAWS & REGULATIONS
U.S. Department of Health and Human Services Announces Expiration of COVID Enforcement Discretion
The U.S. Department of Health and Human Services Office for Civil Rights (“OCR”) announced that the four Notifications of Enforcement Discretion issued under the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) in 2020 and 2021 will expire at 11:59 p.m. on May 11, 2023, due to the expiration of the COVID-19 public health emergency. Under the Notifications of Enforcement Discretion, OCR refrained from imposing financial penalties for violations of certain provisions of HIPAA and provided covered entities with flexibility in setting up community-based COVID-19 testing sites, disclosing testing data to health authorities, using web-based scheduling applications for COVID-19 vaccination appointments, and conducting telehealth appointments. Because OCR had previously stated that it would give covered entities sufficient time to come into compliance with HIPAA with respect to telehealth, OCR will provide a 90-day transition period (May 12, 2023, to August 9, 2023), during which financial penalties will not be imposed for non-compliance.
Federal Agencies Release Joint Statement on Artificial Intelligence
Federal Trade Commission Chair Lina M. Khan, along with officials of the U.S. Department of Justice Civil Rights Division, the Consumer Financial Protection Bureau, and the U.S. Equal Employment Opportunity Commission, released a joint statement outlining a commitment to enforce their respective laws and regulations to promote responsible innovation in automated systems. The statement emphasizes that the agencies’ enforcement authorities extend to automated systems and explains how such systems may contribute to unlawful discrimination or otherwise violate federal law due to problems with data and datasets, the design and use of the systems, and a lack of transparency into the systems’ internal workings. The statement reinforces that the agencies will “vigorously” use their collective authorities to protect individuals’ rights.
HHS Issues Notice of Proposed Rulemaking to Enhance Reproductive Health Care Confidentiality
The U.S. Department of Health and Human Services (“HHS”) issued a Notice of Proposed Rulemaking (“NPRM”) designed to strengthen HIPAA Privacy Rule protections for protected health information relating to reproductive care. The proposed rule would prohibit the use or disclosure of protected health information (PHI) to investigate or prosecute patients, providers, and others involved in the provision of legal reproductive health care, including abortion care. Reproductive health care would be defined to include, but not be limited to, prenatal care, abortion, miscarriage management, infertility treatment, contraception use, and treatment for reproductive-related conditions such as ovarian cancer. The NPRM is responsive to Executive Order 14076, issued by the Biden Administration following the Supreme Court’s decision to overturn Roe v. Wade, which directed HHS to consider ways to strengthen the protection of sensitive information related to reproductive health care services and enhance patient-provider confidentiality.
Online Privacy Act Reintroduced
Two House Democrats from California have reintroduced the federal Online Privacy Act of 2023 (“OPA”), a proposed comprehensive federal privacy law that would grant U.S. consumers rights to access, correct, and delete their data. If enacted, the OPA would also place limits and obligations on companies similar to those found under existing state comprehensive data privacy laws and create a Digital Privacy Agency tasked with enforcing the OPA along with state attorneys general and nonprofit representatives appointed by consumers in private class action lawsuits. Unlike the American Data Privacy and Protection Act, another federal privacy bill that faced opposition for its preemption provisions that would override existing state law protections for consumer data, the OPA seeks to set a legislative floor that would allow state legislatures to add further protections and regulations in conjunction with OPA’s provisions.
CISA Releases Secure by Design Principles for Software Manufacturers
The Cybersecurity and Infrastructure Security Agency (“CISA”) published Shifting the Balance of Cybersecurity Risk: Principles and Approaches for Security-by-Design and -Default, joint cybersecurity guidance developed with the Federal Bureau of Investigation and the National Security Agency, among other international cybersecurity authorities, providing software manufacturers with a recommended implementation roadmap. The guidance focuses on key principles, including taking ownership of security, engaging in “radical transparency,” and encouraging C-suite leadership to prioritize product security for a safer and more secure online environment. The joint guidance also outlines minimum security standards and offers a recommended default baseline of security controls to protect enterprises, including critical infrastructure owners and operators, from malicious cyber actors and to protect against potential liability. According to CISA Director Jen Easterly, the “secure by design and secure by default principles” are intended “to help catalyze industry-wide change across the globe to better protect all technology users.”
EARN IT Bill Reintroduced and Faces Similar Opposition
Senate Bill 1207, the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act (“EARN IT Act”), was reintroduced to the Senate Judiciary Committee in its third iteration. The EARN IT Act seeks to reduce online sharing of child sexual abuse material (“CSAM”) by, among other things, stripping liability protections granted to “providers of interactive computer services” under Section 230 of the Communications Decency Act where CSAM is shared by online users of those services. For example, under the EARN IT Act, the use of encryption technologies may serve as evidence (but not as “an independent basis for liability”) to establish that a company did not “earn” liability protections as such technologies could be used to avoid detection for the transmission of CSAM. Therefore, critics of the proposal continue to raise concerns that the EARN IT Act would ultimately lower consumers’ online protections by discouraging companies from offering encryption.
INTERNATIONAL LAWS & REGULATIONS
U.S. International Trade Administration Seeking Input on Data Privacy Framework
The International Trade Administration of the U.S. Department of Commerce released a Notice of Information Collection and Request for Comment on the proposed EU-U.S. Data Privacy Framework (“DPF”) and extensions of the framework for transfers of personal data from the UK and Switzerland. These frameworks were developed in the wake of the Court of Justice of the European Union’s Schrems II decision in 2020, which invalidated the EU-U.S. Privacy Shield. The DPF is intended to provide U.S. organizations with reliable mechanisms for personal data transfers to the United States from the EU, the UK, and Switzerland while ensuring data protection that is consistent with EU, UK, and Swiss law. EU data protection authorities have scrutinized cross-border data transfers since the Schrems II decision. Meta, for example, announced in recent SEC filings that it could be forced to halt European operations because of an inability to transfer personal data from the EU if the EU does not finalize an adequacy decision relating to the DPF before an expected final decision from Ireland’s Data Protection Commission on the legality of Meta’s EU-U.S. transfers. Several data protection authorities have also ruled that common website analytics tools that transfer IP addresses and other personal data to the U.S. cannot be used without appropriate safeguards.
Members of European Parliament Adopt Resolution against EU-U.S. Data Privacy Framework
The Civil Liberties Committee of the European Parliament adopted a resolution arguing that the European Commission should not grant an adequacy decision relating to the proposed EU-U.S. Data Privacy Framework. The resolution states that the Data Privacy Framework is an improvement over the prior Privacy Shield and EU-U.S. Safe Harbor mechanisms but still does not provide sufficient safeguards. In the resolution, Members of the European Parliament (“MEPs”) stated that the Data Privacy Framework is insufficient because it still allows for bulk collection of personal data in certain cases, does not make bulk data collection subject to independent prior authorization, and does not provide clear rules on data retention, among other issues. The MEPs argue that the Data Privacy Framework as currently proposed will not withstand legal challenges and urge the European Commission to continue negotiating to develop a cross-border data transfer mechanism that will hold up in court and provide certainty for EU businesses and citizens.
Proposed AI Act Revised to Address Generative AI Tools
A revised version of the proposed EU AI Act that seeks to address generative AI tools was circulated. The revised text seeks to distinguish between general-purpose AI and “foundation models.” Foundation models include generative AI systems that are trained on data scraped from the entire Internet. Under the revised text, foundation models would be subject to stricter requirements than general-purpose AI systems. Among other requirements, providers of foundation models would be required to test the models and mitigate reasonably foreseeable risks to health, safety, fundamental rights, the environment, democracy, and the rule of law with the involvement of independent experts. Foundation models that fall in the generative AI category must also comply with further transparency obligations and implement adequate safeguards against generating content in breach of EU law.
RECENT PUBLICATIONS & MEDIA COVERAGE
NYC Introduces Bills to Limit Facial Recognition in Private Sector (Biometric Privacy Insider)