In 2025, BakerHostetler’s privacy attorneys advised hundreds of clients on complex privacy and data protection compliance challenges. As the year closes, our annual privacy review examines some key developments that will define companies’ compliance strategies in 2026. These range from changes in U.S. state consumer privacy laws and regulations to ongoing privacy litigation and enforcement actions, text message program compliance, AI and personal data laws, and developments in international data protection.
We are kicking off our 2026 Privacy Strategist CLE webinar series on January 13 with more insights into what the year might hold for privacy professionals.
State Privacy Enforcement – From Settlements to Sweeps
In 2025, our privacy team responded to inquiries and cure notices from more than half of the state regulators enforcing consumer privacy laws, as well as questions from the Federal Trade Commission (FTC). Unsurprisingly, this heightened regulatory activity coincided with an increase in state enforcement actions, led by the California Privacy Protection Agency (CPPA) and California’s attorney general, which secured multimillion-dollar settlements with companies across industry sectors and imposed fines on data brokers. Connecticut issued a penalty for a deficient privacy notice. Meanwhile, Texas appears to have prioritized litigation, securing a $1 billion settlement and filing new suits under its privacy and data broker laws. Florida’s first enforcement action under the Digital Bill of Rights targeted the sale of sensitive personal data without consent, highlighting alleged age verification failures.
Common violations across this year’s enforcement actions included inadequate privacy notices, failures to honor opt-out rights and Global Privacy Control (GPC) signals, misconfigured consent management tools, and missing data processing agreements. Injunctive relief provisions, such as mandated tracking technology scans, revised opt-out practices, and vendor contract audits, signal regulators’ expectations for proactive compliance programs.
With statutory cure periods expiring and many ongoing investigations, enforcement priorities will continue to shape privacy compliance strategies. Multistate collaboration is also growing, with recently launched joint GPC sweeps and the expanding Consortium of Privacy Regulators, which now comprises 10 states.
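For companies assessing their own exposure, the GPC signal targeted by these sweeps is a simple technical mechanism: browsers that enable it send a `Sec-GPC: 1` request header (and expose `navigator.globalPrivacyControl` client-side), which California and several other states treat as a binding opt-out of sale/sharing. The sketch below shows a minimal server-side check; the `IncomingRequest` interface and function name are illustrative assumptions, not from any framework referenced above.

```typescript
// Illustrative sketch of honoring the Global Privacy Control (GPC) signal.
// The request shape below is a stand-in for an Express-style request object.
interface IncomingRequest {
  headers: Record<string, string | undefined>;
}

// Per the GPC specification, a request carrying the header `Sec-GPC: 1`
// expresses the user's opt-out-of-sale/sharing preference. Under CCPA-style
// laws, a business that sells or shares personal data must honor it.
function gpcOptOutRequested(req: IncomingRequest): boolean {
  return req.headers["sec-gpc"] === "1";
}

// Example: suppress third-party ad/analytics tags when GPC is present.
function shouldLoadTrackingTags(req: IncomingRequest): boolean {
  return !gpcOptOutRequested(req);
}
```

Enforcement actions this year frequently cited sites that displayed opt-out links but never wired the GPC header into their consent management logic, so a check like this belongs wherever tracking tags are injected.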
State Consumer Privacy Laws – Amendments and Regulations Take Center Stage
Although 2025 was the first year since 2020 without the passage of a new comprehensive state privacy law, the legislative landscape was far from quiet. Instead of sweeping new statutes, legislatures focused on amending existing laws – changes that will significantly influence applicability and compliance obligations in 2026 and beyond. State regulatory rulemaking also gained momentum in 2025. For companies, these shifts underscore the need to move beyond merely tracking new laws to actively monitoring amendments and regulations that could alter prior exemptions or reshape privacy program requirements.
In 2025, nearly half of the states with consumer privacy laws amended those statutes. Often, but not always, these amendments were intended to harmonize an existing consumer privacy law with other state consumer privacy laws. Some of the more notable amendments include:
- Connecticut’s Legislature again revamped the Connecticut Data Privacy Act, one of the more frequently amended state privacy laws. SB 1295, effective July 1, 2026, expands the law’s scope by lowering applicability thresholds (now including companies that process personal data of more than 35,000 consumers and those handling sensitive personal data). The amendments narrow existing exemptions, for example, by replacing the previous entity-level exemption for companies subject to the Gramm-Leach-Bliley Act (GLBA) with a new data-level exemption. They expand current sensitive personal data definitions to include health condition, disability or treatment details and information derived from genetic or biometric data while adding neural data, nonbinary or transgender status, financial information, and government-issued IDs. The amendments also strengthen consumer rights by expanding access to inferences and profiling decisions, introducing a right to contest profiling decisions, and limiting disclosure of sensitive identifiers. Finally, SB 1295 introduces new obligations for automated decision impact assessments, which mirror Colorado’s AI rules, and mandates disclosures when large language models (LLMs) are used to process personal data.
- The Oregon Consumer Privacy Act was amended three times in 2025. Of note, Oregon lawmakers expressly banned the sale of precise geolocation data and the personal data of consumers under age 16 effective January 1, 2026. The Act also now explicitly applies to vehicle manufacturers and affiliates who control or process any personal data obtained from a consumer’s use of a motor vehicle or a component of a motor vehicle.
- California became the first state to require web browsers to include built-in functionality for opt-out preference signals under the Opt Me Out Act, which further amends the California Consumer Privacy Act (CCPA). Sponsored by the CPPA, this law takes effect on January 1, 2027.
Despite many consumer privacy law amendments, perhaps the most significant U.S. privacy law development of 2025 was California’s long-anticipated CCPA rulemaking. The CPPA (now styling itself CalPrivacy) finalized regulations on automated decision-making technology (ADMT), risk assessments and cybersecurity audits along with updates to existing CCPA rules. These regulations begin to take effect January 1, 2026, with phased deadlines for ADMT, risk assessments and cybersecurity audits. In particular, the ADMT rules were heavily debated. Ultimately, companies using ADMT for major decisions like lending, housing, education or employment in ways that replace or substantially replace human decision-making must provide clear notices, allow opt-outs or appeals, offer access rights, and conduct risk assessments.
Other states also advanced rulemaking. Colorado amended its Privacy Act rules to clarify obligations under new children’s privacy provisions, define design features that increase minors’ engagement, and update the definition of “revealing” to include precise geolocation data, which is now included in the act’s definition of sensitive personal data under recent amendments. New Jersey released draft regulations for its Data Privacy Act, modeled on Colorado’s Privacy Act rules but with notable departures from both the Colorado rules and New Jersey’s Data Privacy Act itself. Public comments frequently noted inconsistency with and expansion of the statutory text, failures to consider state law interoperability, and technological challenges. Public comment on New Jersey’s draft closed in August, and final rules are pending.
Privacy Litigation – CIPA and Online Tracking
A year ago, we wrote about the growing wave of civil claims tied to cookies, pixels and other website tracking technologies. That wave shows no sign of cresting. At the heart of this trend is the California Invasion of Privacy Act (CIPA), a statute originally enacted in 1967 to address wiretapping of phone communications, now repurposed by plaintiffs’ firms and their “tester” plaintiffs to challenge online tracking practices as unauthorized recordings or interceptions of online communications. In 2025, plaintiffs continued to refine their theories, expanding the CIPA provisions invoked and adding claims under other laws. These suits now target a wide range of technologies, from AI chatbots and session replay tools to search bars and standard analytics or advertising pixels.
The persistence of these claims stems in part from the lack of definitive court rulings on whether and how CIPA applies to online tracking technologies. While many cases have been dismissed, plaintiffs are leveraging decisions that allow claims to survive early motions and are using tactics like mass arbitration to avoid unfavorable precedent. We have also seen a significant uptick in demand letters for settlements before filing claims.
A proposed bill that would have amended CIPA to exempt online tracking technologies used in compliance with the CCPA stalled in 2025 in the Assembly Committee on Privacy and Consumer Protection, leaving businesses exposed. Until such reforms pass, companies should expect continued privacy litigation risks and consider proactive measures to manage their use of tracking technologies, such as deploying consent banners and updating website terms.
TCPA and Text Message Program Compliance Challenges
Telemarketing law compliance has long been a hotbed of class action litigation, and 2025 was no exception. Although text messaging has taken center stage in recent years, many companies still use more traditional telephone calls for customer outreach and thus also need to be aware of litigation trends in this area. For example, using AI to create outbound voice messaging requires the same level of consent as would be needed for the more old-school prerecorded or “robot” voices of the past and may indeed necessitate additional disclosures about the use of AI more generally.
Adding to an already hectic compliance landscape, the Supreme Court’s June 20 decision in McLaughlin Chiropractic Associates, Inc. v. McKesson injected yet more ambiguity by holding that district courts are no longer bound by the Federal Communications Commission’s (FCC) interpretations of the Telephone Consumer Protection Act (TCPA) – a corpus of regulations and rulemaking spanning more than three decades and thousands of pages. Almost immediately after that decision, courts around the country began issuing divergent opinions on what had long been accepted as a basic FCC determination: that text messages are covered as “calls” subject to the TCPA. In addition, in January 2025 the FCC pulled back on enforcement of its proposed December 2023 rulemaking that had aimed to close the so-called “lead generator loophole,” a rule that was subsequently struck down by the 11th Circuit. Ongoing uncertainty about how the FCC will respond to industry pressure to revisit TCPA rulemaking, together with pending court cases calling into question the basics of TCPA compliance under prior FCC orders, has set the stage for a tumultuous 2026 for text and telephone marketing.
While the bulk of litigation and enforcement in this area occurs at the federal level, in the past five years more than a dozen states have enacted and amended their own “mini-TCPA” laws, which may apply to businesses operating in those states, calls or texts made to residents of those states, or both. Some of the requirements under these state laws align with the TCPA, but others add obligations and potential liabilities that should be taken into account when designing a compliance program. For example, Texas recently amended its telephone solicitation law, effective September 1, 2025, to expand potential private rights of action and to explicitly cover texts, images, and similar transmissions meant to induce a purchase or transaction. Although the Texas law appears to apply broadly to any company marketing to Texas residents, it also contains numerous exemptions that may reduce the compliance burden on most retailers and many other types of businesses that fall under one or more exceptions.
AI and the Privacy Principles Driving State Action
In 2025, states shifted from broad AI frameworks to targeted, use-case-specific rules. These laws focused on healthcare AI, chatbots and generative AI, while introducing accountability tools like regulatory sandboxes and affirmative defenses. Most measures tied closely to consumer protection, targeting automated decision-making and high-risk uses in employment, lending, housing and healthcare. Regulators also flagged emerging risks such as algorithmic pricing discrimination, AI companions and agentic AI capable of autonomous decision-making.
States increasingly applied traditional privacy concepts, such as notice and choice, to AI regulation. States also emphasized transparency and accountability through disclosures, impact assessments and governance measures to mitigate bias and privacy risks. California’s CPPA finalized its ADMT rules, effective January 1, 2027, requiring notices, opt-out or appeals rights, access rights, and risk assessments for ADMT uses that replace or substantially replace human judgment. Connecticut’s privacy law amendment introduced similar obligations, and New Jersey’s draft rules address automated systems producing legal or significant effects on consumers. Utah narrowed its generative AI law to cover only high-risk consumer-facing interactions, specifically those tied to financial, legal or medical decisions. These developments signal a possible trend toward aligning AI oversight with existing privacy frameworks, ensuring personal data processed by AI is subject to purpose limitations, consent requirements and risk-based safeguards.
Looking ahead to 2026, businesses using AI for significant decisions should anticipate state-level obligations around notice, opt-out rights and algorithmic accountability. At the same time, federal policymakers are pushing for uniformity. In December 2025, the White House issued an executive order aimed at removing state law barriers to a national AI policy, citing concerns that fragmented state rules could stifle innovation and create compliance burdens. It is uncertain whether federal agencies will act quickly to implement aspects of this executive order. States are also likely to challenge its validity. Businesses should monitor the state and federal AI dynamic in 2026 to inform AI strategy.
International Data Protection – China, India and the EU’s Digital Refresh
The world’s most populous country finally has an operational privacy law. India’s Digital Personal Data Protection Act (DPDPA) at last became effective on November 13, 2025, when the final rules for implementation were issued. The DPDPA introduces a major new component for global privacy compliance programs. Key obligations include detailed notice and consent requirements, strict retention limits, mandatory breach notifications, and enhanced compliance duties for certain entities. The DPDPA applies broadly to personal data processed in India or related to services offered to individuals in India. Critically, it can cover processing outsourced by foreign companies to Indian vendors. This means U.S. businesses using Indian service providers must ensure those vendors comply with DPDPA obligations, including consent requirements, retention limits and breach notifications. Contracts should clearly allocate compliance responsibilities, and companies may need to check whether their Indian processors are subject to additional requirements, such as annual audits and appointing a local data protection officer. With a phased implementation window that sunsets at the 18-month mark, companies operating in or outsourcing to India should begin planning now, as these requirements can differ significantly from other established privacy and data protection frameworks.
The world’s third-most-populous country is now restricting data access from the second-most-populous country (and others). In April 2025, the U.S. Department of Justice’s (DOJ) National Security Division launched the Data Security Program (DSP) under Executive Order 14117 to limit foreign access by “countries of concern,” such as China, to U.S. government-related and bulk sensitive personal data. The rule categorizes certain foreign data transfers as either prohibited or restricted, with various compliance obligations applying to restricted transfers. Prohibitions and certain obligations took effect immediately, while enhanced due diligence, reporting and audit requirements for restricted transactions began October 6, 2025. Restricted transactions must also comply with security measures established by the Cybersecurity and Infrastructure Security Agency (CISA). Violations can result in significant civil and criminal penalties. The DOJ has issued a compliance guide and FAQs. All companies, even those that do not believe they handle “covered data,” should conduct a “know your data” assessment to map data flows, identify exposure to covered persons or countries of concern, and implement internal controls aligned with the DSP’s prohibited and restricted transaction rules.
Finally, the world’s most influential data protection regime may be getting some updates. In November 2025, the European Commission launched its Digital Omnibus initiative to reduce compliance fragmentation and improve interoperability across the European Union’s General Data Protection Regulation, AI Act, Data Act, ePrivacy Directive, NIS2 and other digital laws. If adopted, U.S. companies operating in the European Union could see a more streamlined and predictable compliance landscape. Key proposals include clarifying the definition of personal data, simplifying cookie consent, and introducing a single entry point for incident notifications across privacy and cybersecurity frameworks. In parallel, the AI Act Digital Omnibus aims to ease compliance burdens and foster innovation by extending deadlines for high-risk AI systems, introducing a “stop-the-clock” mechanism tied to receiving regulatory guidance, simplifying registration for lower-risk systems, and expanding regulatory sandboxes. But hold off on celebrating just yet! These proposals are part of the broader Digital Fitness Check, which will extend through 2026, and they must still pass through the European Parliament and Council. If enacted, the proposals could mean fewer overlapping reporting requirements.
What Else to Know?
- App Store Age Assurance Laws – In 2025, several states advanced age assurance laws aimed at strengthening protections for minors online. California passed the Digital Age Assurance Act, requiring operating systems to collect users’ age information and transmit age signals to apps starting in 2027. Louisiana, Texas and Utah enacted similar age assurance laws, which, starting with Texas’ on January 1, 2026, will require app stores to verify age, categorize users into age brackets, and enable parental oversight of teen and child accounts with flow-down obligations on app developers. These measures represent a significant shift toward platform-level responsibility for age verification and parental controls.
- Other Developments in Minors’ Privacy – Beyond age assurance, states continued to expand minors’ privacy protections in 2025. Arkansas, Nebraska and Vermont passed age-appropriate design code laws, though their provisions vary widely, creating compliance complexity. Montana and Oregon amended privacy statutes to include “willful disregard” standards for identifying minors, while Colorado and Connecticut strengthened consent requirements for minor data. California also revised its definition of sensitive personal information to include data from consumers under 16. These changes underscore a growing patchwork of obligations for businesses handling minors’ data. At the federal level, the FTC finalized new rulemaking under the Children’s Online Privacy Protection Act.
- Data Brokers – In 2025, data brokers faced increased scrutiny. California finalized Delete Act regulations, began enforcement for non-registration, and advanced a centralized Delete Request and Opt-Out Platform. Texas amended its data broker law twice, adding notice requirements for consumer rights under the Texas Data Privacy and Security Act and broadening the definition of “data broker” to include any entity handling personal data not collected directly from the individual, removing the prior revenue-based limitation.