Welcome to this month’s issue of The BR Privacy & Security Download, the digital newsletter of Blank Rome’s Privacy, Security, & Data Protection practice. We invite you to share this resource with your colleagues and visit Blank Rome’s Privacy, Security, & Data Protection webpage for more information about our team.
STATE & LOCAL LAWS & REGULATIONS
Missouri Attorney General Issues Social Media Content Moderation Rules: The Missouri Attorney General has introduced proposed rules under the Missouri Merchandising Practices Act aimed at regulating content moderation on large social media platforms (i.e., those with at least 50 million active U.S. users and 1 billion worldwide monthly users). These rules would require such platforms to let users choose a third-party content moderator instead of relying on the platform’s own content moderation. Upon account activation and at least every six months, users must be presented with a choice screen to select from available third-party moderators, with no default selections or platform favoritism. Platforms must provide interoperable access to third-party moderators and are prohibited from overriding their decisions, except in specific cases such as compliance with federal law, prevention of child exploitation, incitement of criminal activity, or other illegal content. The proposed rules are currently open for public comment for 30 days after their publication in the Missouri Register.
CPPA Issues Additional Modifications to Proposed Regulations: The California Privacy Protection Agency (“CPPA”), the regulatory authority charged with enforcing the California Consumer Privacy Act, as amended by the California Privacy Rights Act (“CCPA”), has once again modified its proposed regulations on cybersecurity audits, risk assessments, and automated decision-making technology (“ADMT”). Key changes include: (i) less burdensome requirements on cybersecurity audits, including the ability to leverage existing frameworks for such audits and staggered deadlines for the completion of such audits based on revenue; (ii) narrowing the content of risk assessments for ADMT, including allowing businesses to rely on assessments conducted for other state laws; (iii) narrowing the definition of ADMT to cover only technologies that replace human decision-making for significant consumer decisions, such as financial or employment outcomes; and (iv) limiting consumer rights regarding ADMT, focusing on transparency and access without requiring disclosure of sensitive business information. Public comment on the modified regulations closed on June 2, 2025.
CPPA and UK ICO Sign Declaration of Cooperation: The CPPA and the UK Information Commissioner’s Office (“UK ICO”) have signed a declaration of cooperation. Announced on April 30, 2025, this agreement is a significant step for the CPPA in enhancing its collaboration with international data protection authorities. Signed by the UK Information Commissioner and the CPPA’s head of enforcement, the declaration formalizes existing cooperation by allowing the agencies to facilitate joint research and education regarding new technologies and data protection issues. They will also share best practices, knowledge, and investigative methods, hold staff meetings, and develop mechanisms for mutual collaboration. The partnership aims to deepen the CPPA’s knowledge base and leverage best practices from other regulators dealing with similar privacy concerns. This agreement with the UK ICO is the CPPA’s third such international collaboration, following previous declarations with South Korea in January 2025 and France in June 2024.
Proposed Amendments to Colorado AI Act Fail to Pass Before End of Legislative Session: Colorado legislators Robert Rodriguez (D) and Brianna Titone (D) introduced legislation that would make significant changes to the state’s comprehensive artificial intelligence (“AI”) law passed last year, the Colorado AI Act. The amendments were introduced late in the legislative session and failed to pass before the legislature adjourned for the year. Even so, the proposal reflects concern about over-regulation of AI in Colorado. The amendments would have revised certain definitions, given companies more time to comply with the law, and established a tiered implementation schedule for smaller companies. They would have changed the definition of algorithmic discrimination to clarify that a discriminatory algorithm is one used in a manner that violates existing anti-discrimination laws, and would have updated the definition of “developer” to exclude entities that “intentionally and substantially” modify AI systems, among other important changes to the scope and timing of the AI law.
Nebraska and Vermont Legislatures Pass Age-Appropriate Design Bills: The state legislatures of Nebraska and Vermont joined states such as Maryland and California in passing age-appropriate design code legislation intended to safeguard children from harm when using online services. Among other things, Nebraska’s bill would enhance the protection of children’s online data, give parents the ability to manage and control privacy and account settings, and limit companies’ ability to target children through product designs that encourage excessive use. The Vermont legislation would require businesses to provide tools for minors to manage their privacy settings and delete their accounts and would prohibit sending push notifications during late-night hours. The Nebraska law would go into effect on January 1, 2026, and the Vermont law would go into effect on July 1, 2026. Other state age-appropriate design laws have been challenged on First Amendment grounds.
California SB 690 Passes in the Senate: California Senate Bill 690 recently passed the Senate in a unanimous vote, marking a significant step toward limiting predatory lawsuits against online businesses under the California Invasion of Privacy Act (“CIPA”). The bill aims to curb the surge of litigation targeting online businesses for technical privacy violations. It now moves to the Assembly for further consideration and, if enacted, could provide much-needed relief for companies facing costly and often questionable CIPA claims. Read our alert for more information on the bill.
FEDERAL LAWS & REGULATIONS
U.S. Senators Re-Introduce Kids Online Safety Act (“KOSA”): U.S. Senators Richard Blumenthal (D-CT), Marsha Blackburn (R-TN), John Thune (R-SD), and Chuck Schumer (D-NY) re-introduced the bipartisan KOSA. KOSA establishes requirements for certain online platforms to protect minors (i.e., individuals under the age of 17) from foreseeable harm. Covered platforms must implement safeguards, including privacy controls, limitations on addictive design features, parental tools, and clear reporting mechanisms for harm. Parental tools must allow parents to manage a minor’s privacy settings, restrict purchases the minor can make on the platform, and monitor the minor’s time spent online. Covered platforms are prohibited from advertising illegal products to minors and must provide transparent disclosures about their practices, including how their recommendation systems work. KOSA requires annual independent audits and public reports for large platforms. If passed, KOSA would become effective 18 months after enactment and be enforced by the Federal Trade Commission (“FTC”) and State Attorneys General.
Bipartisan TAKE IT DOWN Act Signed Into Law: The TAKE IT DOWN Act (the “Act”) was signed into law by President Donald Trump on May 19, 2025, after receiving resounding bipartisan support in the Senate and House. The Act addresses nonconsensual intimate images (“NCII”), specifically targeting both authentic content and “digital forgery” created using AI or machine learning. The Act establishes criminal penalties for knowingly publishing authentic or synthetic NCII, including fines and imprisonment, with stiffer sentences for images depicting minors. Threatening to publish such content is also prohibited. The Act clarifies that sharing an image with one person does not constitute consent for broader publication. Additionally, the Act requires covered platforms and online services primarily hosting user content to remove unlawful NCII and known copies within 48 hours of a valid request. The FTC is tasked with enforcing these removal obligations.
House Passes 10-Year Ban on State AI Regulation in Budget Reconciliation Bill: The U.S. House of Representatives included a provision in its version of the budget reconciliation bill that would prohibit states from enacting and enforcing AI laws “limiting, restricting, or otherwise regulating” AI systems or automated decision-making systems that are “entered into interstate commerce.” House Republicans who supported the provision stated in a House Energy and Commerce subcommittee hearing that it is needed to prevent uneven regulation by a patchwork of state laws and to promote U.S. AI innovation. Forty State Attorneys General signed a letter voicing opposition to the 10-year moratorium, calling it an “irresponsible amendment” that would prevent states from protecting consumers. More than 140 civil rights and consumer advocacy groups followed suit, sending a letter to House leadership stating that Congress’s failure to enact a national AI framework makes it imperative that states be allowed to adopt regulations to prevent potential harm. The moratorium faces an uncertain future in the Senate, where some Republican Senators have expressed a need for existing state laws to protect consumers.
Regeneron Winning Bidder in 23andMe Bankruptcy Sale: Regeneron Pharmaceuticals agreed to pay $256 million to purchase 23andMe, including its personal genome service data, biobank, and other assets. In a statement, Regeneron Genetics Center emphasized its commitment to using 23andMe customer data ethically. 23andMe’s bankruptcy has sparked significant concern about the privacy of the genetic information the company has collected. A number of regulators, including the FTC, the California Attorney General, the Privacy Commissioner of Canada, and the UK ICO, issued statements calling for protections for consumer genetic information subject to the sale and urging consumers to be diligent in exercising their rights with respect to their genomic data. 23andMe told a meeting of creditors in early May that 1.3 million customers have asked the company to cancel their accounts and delete their data since the company entered bankruptcy. A bipartisan group of U.S. Senators has introduced legislation in response to the high-profile bankruptcy. The proposed legislation would amend the U.S. Bankruptcy Code to require consumers to affirmatively consent to the use, sale, or lease of genetic information during bankruptcy.
CISA Issues Guidance to Critical Infrastructure Operators on Cyber Defense: The Cybersecurity and Infrastructure Security Agency (“CISA”) issued guidance for critical infrastructure operators to defend against cyberattacks affecting operational technology (“OT”) and industrial control systems (“ICS”). Recommendations include removing OT connections from the internet; changing default passwords immediately and using strong, unique passwords; ensuring remote access to OT networks is configured securely; segmenting OT and ICS networks; and testing business continuity and disaster recovery plans.
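To make the segmentation recommendation concrete, below is a minimal sketch in Python of the kind of internal check an operator might run from an IT-side workstation to confirm that OT assets are not reachable across the IT/OT boundary. The subnet addresses, ports, and protocol labels are hypothetical placeholders chosen for illustration and are not drawn from the CISA guidance.

import socket

# Hypothetical OT assets and the industrial-protocol ports they expose;
# these addresses and ports are illustrative only.
OT_TARGETS = [
    ("10.20.0.11", 502),    # e.g., a PLC speaking Modbus/TCP
    ("10.20.0.12", 44818),  # e.g., EtherNet/IP
    ("10.20.0.13", 20000),  # e.g., DNP3
]

def is_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Run from a corporate (IT-side) workstation: a reachable OT host indicates
    # a potential gap in IT/OT segmentation that warrants review.
    for host, port in OT_TARGETS:
        status = "REACHABLE - review segmentation" if is_reachable(host, port) else "blocked"
        print(f"{host}:{port} -> {status}")

Any “REACHABLE” result in such a check would suggest a path between the corporate network and OT assets of the kind the CISA guidance recommends eliminating or tightly controlling.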
U.S. LITIGATION
Advertising Industry Groups Urge Supreme Court to Take Up VPPA Cases: The Interactive Advertising Bureau and other advertising industry groups have asked the U.S. Supreme Court (“SCOTUS”) to rule that companies do not violate the Video Privacy Protection Act (“VPPA”) by embedding online tracking technologies on their websites. They are urging SCOTUS to hear the National Basketball Association’s (“NBA”) appeal of the Second Circuit Court of Appeals’ ruling in Salazar v. NBA, which held that website users are considered “consumers” under the VPPA and that the collection and sharing of video viewing history on the NBA website with Facebook via the Meta Pixel, without notice and consent, violated the VPPA. The advertising industry groups argue that the VPPA, enacted in 1988 to protect video rental privacy, is outdated and was never intended to regulate modern online advertising practices. They warn that applying the VPPA to online tracking threatens targeted advertising, especially for small businesses.
Broadcasting Company Wins Dismissal of VPPA Suit: Hearst Television (“Hearst”) has won dismissal of a lawsuit in the U.S. District Court for the District of Massachusetts. The suit, brought by plaintiff Charles Therrien, alleged that Hearst unlawfully disclosed his personally identifiable information (“PII”) from its news app to third parties, specifically Braze and Google, in violation of the VPPA. The plaintiff argued this disclosure allowed Hearst to increase revenue through targeted advertising. Judge Richard G. Stearns granted Hearst’s motion for summary judgment, finding that Hearst did not violate the VPPA. Judge Stearns determined that the shared data did not constitute PII under the law, noting that a single geolocation data point was insufficient to identify an individual. Judge Stearns also concluded that the disclosure of the plaintiff’s email address to Braze fell under the VPPA’s ordinary course of business exception, as it was incident to order fulfillment or request processing for services like push notifications and newsletters.
Georgia's Social Media Age Limit Law Challenged: Internet trade group NetChoice has filed a lawsuit against Georgia Attorney General Christopher M. Carr, challenging Senate Bill 351 (“SB 351”), a new law scheduled to take effect on July 1, 2025. NetChoice, which represents Internet companies like Meta and Reddit, argues that SB 351 unconstitutionally regulates minors’ access to protected online speech, impairing adults’ access as well. The lawsuit challenges several key provisions: (i) requiring age verification for all account holders or treating everyone as a minor, (ii) mandating parental consent for minors under 16 to create accounts, and (iii) restricting advertising displayed to minors based on personal information. NetChoice contends SB 351 violates the First Amendment, is unconstitutionally vague, fails strict scrutiny due to its content and speaker-based definitions, and is not narrowly tailored, pointing to existing parental control tools as less restrictive alternatives.
California Appellate Court Finds Restaurant Still Liable for Payment in Personal Injury Suit After Making Payment to Fraudulent Account: The California Court of Appeal for the Fourth Appellate District affirmed a lower court decision finding that a restaurant group induced by an imposter to wire a settlement payment to a fraudulent account remained liable for that payment to the proper party. In Thomas v. Corbyn Restaurant Development Corp., the court noted that which party bears the risk of loss when an imposter causes one settling party to wire settlement proceeds to a fraudulent account rather than to the other settling party was an issue of first impression for California courts. In this case, an unknown third party purporting to be the plaintiff’s counsel sent spoofed e-mails with fraudulent wire instructions to the defendant’s counsel, who then wired the payment to the fraudulent account. When the fraud was discovered, the plaintiff requested payment and the defendant refused to pay. The trial court applied the general rule from federal case law that the party in the best position to prevent the fraud should bear the risk of a fraudulent payment. The appellate court affirmed, finding that the trial court correctly applied that line of persuasive federal authority because the defendant was best placed to verify the authenticity of the e-mails before sending the payment.
Court Declines to Lift Injunction Protecting U.S. Privacy and Civil Liberties Oversight Board Members from Dismissal: Following a decision from SCOTUS to lift a freeze on President Trump’s firing of members of the National Labor Relations Board (“NLRB”) and Merit Systems Protection Board (“MSPB”), the U.S. District Court for the District of Columbia declined to lift its injunction on firing members of the U.S. Privacy and Civil Liberties Oversight Board (“PCLOB”). The District Court distinguished PCLOB activities from those of the NLRB and MSPB, stating that, unlike the NLRB and MSPB, the PCLOB does not exercise executive authority. Rather, the PCLOB makes recommendations that the executive branch is free to adopt or ignore, so preserving the jobs of the dismissed board members would not disrupt the President’s actions. In granting the original injunction, the District Court found there is a substantial public interest in the effective oversight of government counterterrorism action and authorities that is furthered by the expert and candid advice provided by the PCLOB.
U.S. ENFORCEMENT
FTC Requires AI Detection Tool Provider to Back Up Accuracy Claims: The FTC issued a proposed order against Workado, LLC (“Workado”). Workado provides AI Content Detector, a product that helps users determine whether online content was developed by artificial intelligence (“AI”). The FTC alleges that while Workado claimed that AI Content Detector was developed using a wide range of material (e.g., blog posts and Wikipedia entries) to improve accuracy, AI Content Detector was only trained to effectively classify academic content. Workado also claimed that AI Content Detector was 98 percent accurate, but the FTC’s independent testing revealed AI Content Detector’s accuracy rate on general-purpose content was only 53 percent. The FTC’s proposed order prohibits Workado from misrepresenting AI Content Detector’s accuracy without sufficient evidence, requires Workado to retain any evidence it uses to support its accuracy claims, e-mail eligible consumers about the order and settlement with the FTC, and submit compliance reports to the FTC annually for three years.
FCC Urges Second and D.C. Circuit Courts to Uphold Fines Against Telecommunications Providers: The Federal Communications Commission (“FCC”) sent letters to the D.C. Circuit and Second Circuit Courts of Appeals, urging them not to follow the Fifth Circuit Court of Appeals’ decision invalidating the FCC’s fines against wireless carriers. In April 2024, the FCC fined Verizon $47 million and T-Mobile $80 million for failing to properly vet third parties before selling them customer location data without customer consent. However, the Fifth Circuit overturned a $57 million fine against AT&T, finding that the FCC’s forfeiture process violated the Seventh Amendment right to a jury trial. Under that process, companies can obtain a jury trial only by refusing to pay the fine and waiting for the Department of Justice to initiate a collection action. The FCC argued in its letters that the Fifth Circuit’s approach should not be adopted by other courts, emphasizing that the decision rested on the Fifth Circuit’s own precedent.
Defense Contractor Settles with DOJ for Alleged Cybersecurity Violations: The U.S. Department of Justice (“DOJ”) has settled with Raytheon Company, RTX Corporation, Nightwing Group LLC, and Nightwing Intelligence Solutions LLC (collectively, “Raytheon”) to resolve allegations that the companies violated the False Claims Act (“FCA”) by failing to comply with cybersecurity requirements in contracts or subcontracts involving the Department of Defense (“DoD”). The DOJ alleged that Raytheon failed to develop and implement a system security plan for an internal company system, as required by DoD cybersecurity regulations, and failed to ensure that the system complied with other cybersecurity requirements contained in Defense Federal Acquisition Regulation Supplement (“DFARS”) 252.204-7012 and Federal Acquisition Regulation (“FAR”) 52.204-21. The DOJ further alleged that Raytheon used this noncompliant internal system to develop, use, or store covered defense information and federal contract information during its performance on 29 DoD contracts and subcontracts. The settlement requires Raytheon to pay the DOJ $8.4 million.
Texas Attorney General Issues Privacy Law Compliance Warnings to Several Chinese Companies: The Texas Attorney General has notified TP-Link, Alibaba, CapCut, and several other companies aligned with China and the Chinese Communist Party (“CCP”) that they are in violation of the Texas Data Privacy and Security Act (“TDPSA”). While the Texas Attorney General did not specify how these companies were in violation of the TDPSA, he stated that they must “protect Texans’ data from falling into the hands of the CCP.” These companies have 30 days to remedy their alleged violations. The Texas Attorney General has the authority to bring enforcement actions for violations of the TDPSA and to collect civil penalties of up to $7,500 per violation, injunctive relief, attorney fees, and other expenses incurred in investigating and bringing the matter. This latest action is part of the Texas Attorney General’s data privacy and security initiative and follows his investigation of DeepSeek, which the Texas Attorney General states is affiliated with the CCP.
Texas Attorney General Finalizes Settlements Regarding Incognito Mode and Biometric Data: The Texas Attorney General has entered into a settlement agreement with Google to resolve claims that Google unlawfully tracked and collected its users’ data. The Texas Attorney General sued Google in 2022 for allegedly misleading users into thinking that their geolocation, search history, and other data were private while using Google’s incognito feature. The Texas Attorney General also sued Google for products that allegedly allowed Google to collect and process biometric data, such as “the unique characteristics of an individual’s face and voice,” without users’ consent, in violation of Texas’ Capture or Use of Biometric Identifiers Act. The settlement requires Google to pay $1.375 billion. To date, no other state has obtained more than $93 million from Google in a settlement of similar data privacy claims.
Michigan AG Sues Streaming Company over Children’s Data Privacy Concerns: On April 29, 2025, the Michigan Attorney General filed a lawsuit against Roku, Inc. (“Roku”) in the U.S. District Court for the Eastern District of Michigan. The lawsuit alleges that Roku violates the Children’s Online Privacy Protection Act (“COPPA”) and the Michigan Consumer Protection Act. According to the complaint, Roku systematically collected and processed the personal information of children, including location data, voice recordings, IP addresses, and browsing histories, without providing the required notice or obtaining parental consent. The suit claims Roku allows and partners with third parties, including web trackers and data brokers, to collect and monetize children’s personal information. The lawsuit also accuses Roku of misleading parents about data collection and privacy settings and violating the VPPA by disclosing users’ viewing history to third parties.
CPPA Orders Florida Data Broker to Pay $46,000 Fine Under Delete Act: The CPPA has ordered Florida-based data broker Jerico Pictures, Inc., d/b/a National Public Data (“National Public Data”), to pay a $46,000 fine, the maximum penalty available under the Delete Act, for the company’s failure to register and pay an annual fee. National Public Data registered as a data broker 230 days late, doing so only after the CPPA’s Enforcement Division contacted it during an investigation. The order was issued by default because the company did not challenge the allegations. The Delete Act mandates data broker registration to fund the California Data Broker Registry and the upcoming Data Broker Requests and Opt-out Platform (“DROP”), which will allow consumers to request that all registered data brokers delete any personal information held about them.
FTC Finalizes Order with GoDaddy over Data Security Failures: The FTC has finalized an order with web hosting provider GoDaddy to settle allegations of data security failures. The FTC claimed GoDaddy misled customers by failing to implement standard data security practices, including multi-factor authentication and threat monitoring, despite advertising “award-winning security.” These failures resulted in several data breaches between 2019 and 2022, allowing unauthorized access to customer websites and data. The order prohibits GoDaddy from misrepresenting its security practices and its compliance with privacy or security programs. GoDaddy will also be required to establish a comprehensive information security program, secure its connections, manage updates, and implement mandatory multi-factor authentication.
CPPA Fines Retailer for CCPA Violations: The CPPA announced that it entered into a settlement with retailer Todd Snyder, Inc. (“Todd Snyder”) to resolve allegations that it failed to comply with the CCPA. Specifically, the CPPA alleged that Todd Snyder failed to process consumer requests to opt out of the sale or sharing of personal information for a 40-day period because it had not properly configured its privacy portal, required consumers to submit more information than necessary to process their privacy requests, and required consumers to verify their identity before allowing them to opt out of the sale or sharing of their personal information. Todd Snyder has agreed to pay a $345,178 fine to resolve the allegations. Todd Snyder will also change the mechanisms and practices it uses to receive consumer rights requests and provide CCPA compliance training to its employees.
OCR Settles with Healthcare Provider over Unauthorized Access to Medical Records: The U.S. Department of Health and Human Services Office for Civil Rights (“OCR”) announced it had entered into a settlement with BayCare Health System (“BayCare”) stemming from a complaint OCR had received from a BayCare patient. The patient reported to OCR that she was contacted by an unknown individual who had photographs of her medical records as well as a video of someone scrolling through her records on a computer screen. OCR determined that the credentials used to access the patient’s medical record belonged to a non-clinical former staff member of a physician’s practice who had access to BayCare’s electronic medical records for purposes of continuity of care for common patients. Under the terms of the settlement, BayCare will pay an $800,000 fine and adopt a corrective action plan requiring a number of corrective measures to address alleged violations of the Health Insurance Portability and Accountability Act (“HIPAA”) Security Rule. OCR will monitor the corrective action plan for two years.
INTERNATIONAL LAWS & REGULATIONS
EDPB and EDPS Adopt Letter to European Commission Regarding GDPR Simplification: The European Data Protection Board (“EDPB”) and the European Data Protection Supervisor (“EDPS”) adopted a letter to the European Commission regarding a proposal on the simplification of record-keeping obligations under the General Data Protection Regulation (“GDPR”). The EDPB and EDPS expressed preliminary support for a targeted initiative to simplify the GDPR’s Article 30(5) record-keeping requirements by extending the derogation from the obligation to maintain records of processing to organizations with fewer than 500 employees that meet specified annual revenue criteria and to non-profits with fewer than 500 employees. The derogation currently applies to organizations with fewer than 250 employees unless the processing they carry out is likely to result in a risk to the rights and freedoms of data subjects. Any formal amendment to the GDPR would follow formal consultation among European Union bodies after publication of the draft legislative change.
Italian Data Protection Authority Fines U.S. Chatbot Company: The Italian Data Protection Authority (the “Garante”) has fined Luka Inc. (“Luka”), which provides the “Replika” chatbot, €5 million and launched an investigation into the processing of personal data by the generative AI system underlying the service. In 2023, the Garante informed Luka that it had failed to identify legal bases for the processing of personal data to provide Replika and that the company’s privacy notice did not comply with the GDPR in several respects. Additionally, the Garante found that Luka did not provide an adequate age verification mechanism for the service. The Garante has issued a request for information to the company as part of its new investigation into the underlying AI technology, asking Luka to provide details on the risk assessments conducted and the measures used to protect data during the development and training of the AI model used by Replika, the categories and types of personal data processed, and whether anonymization or pseudonymization measures have been implemented.
Korean Data Protection Authority Fines Retailer over Data Transfer Violations: Korea’s Personal Information Protection Commission (“PIPC”) has fined e-commerce company Temu ₩1.3 billion (approximately $980,000) for violations of the Korean Personal Information Protection Act (“PIPA”) related to international data transfers. The PIPC alleges that Temu transferred data to Japan, Singapore, and China without properly notifying users or disclosing the transfers in its privacy notice, as required by the PIPA. The PIPC also found that Temu failed to appoint a local representative in Korea.
Quebec Data Protection Authority Issues Guidance on Personal Data Transfers and Access Limitations: The Commission d’accès à l’information du Québec (the “Commission”) issued guidance on complying with the province’s data privacy laws with respect to data transfers and access restrictions. The guidance states that companies must limit access to personal data to authorized persons for whom the data is necessary for the performance of their duties. Whether access is necessary depends on the business purposes for which the company is authorized to process personal data, which in turn requires an analysis of the notices the company has provided and the consents it has obtained from data subjects. Similarly, the Commission states that companies must analyze the necessity of data transfers and any consent they may have obtained for such transfers, as well as whether any circumstances exist that may permit transfer without consent.
RECENT PUBLICATIONS & MEDIA COVERAGE
Pay Up or Lawsuit Up: The 30-Day Countdown That’s Fueling Arbitration Disputes
Blank Rome partners Harrison Brown and Ana Tagvoryan and associates Victor J. Sandoval and Alexander D. Newman authored this alert discussing the recent wave of arbitration demands under the California Invasion of Privacy Act.
Digital Blind Spots Leave Fintech Firms Exposed as Cyber Litigation Explodes
Blank Rome partner Phillip N. Yanella was featured in this PYMNTS article discussing the rise in data breach lawsuits filed in recent years.
Blank Rome partner Jeffrey N. Rosenthal was featured in this Cybersecurity Law Report article discussing biometric law compliance considerations in the wake of recent privacy complaints.
© 2025 Blank Rome LLP. All rights reserved. Please contact Blank Rome for permission to reprint. Notice: The purpose of this update is to identify select developments that may be of interest to readers. The information contained herein is abridged and summarized from various sources, the accuracy and completeness of which cannot be assured. This update should not be construed as legal advice or opinion, and is not a substitute for the advice of counsel.