Since our mid-year recap of minors’ privacy legislation, several significant developments have emerged in the latter half of 2025. We summarize the notable developments below.
Age Signaling Requirements
In October, California enacted the Digital Age Assurance Act (AB 1043), which, effective January 1, 2027, requires operating system providers (defined as entities that develop, license, or control operating system software) to offer an accessible interface at account setup that prompts the account holder to indicate the user’s birth date, age, or both. Application developers, in turn, are required to request an age signal for a particular user when the application is downloaded and launched.
AB 1043 differs from the app store legislation passed earlier this year in Utah, Texas, and Louisiana because it applies to operating system providers rather than solely to mobile application store providers. The Texas App Store Accountability Act is scheduled to take effect on January 1, 2026, but in October the Computer & Communications Industry Association (“CCIA”) filed a lawsuit challenging the Texas law as unconstitutional.
Mental Health Warning Labels
State legislatures are also targeting minors’ social media usage. Both California and Minnesota enacted laws requiring social media platforms to display warning labels to users; Minnesota’s law takes effect July 1, 2026, and California’s takes effect January 1, 2027. The New York legislature also passed a bill that would require warning labels on certain social media platforms, which awaits the Governor’s signature. These laws generally prescribe requirements for the duration, design, and text of the warning labels. In November 2025, the U.S. District Court for the District of Colorado temporarily halted Colorado’s warning label law, which would have taken effect on January 1, 2026, and would have required social media platforms to display warning messages to users under 18 about the negative impacts of social media.
Regulation of AI Chatbots
The second half of 2025 saw increased attention to minors’ use of AI chatbots. The Federal Trade Commission (“FTC”) launched an inquiry under its Section 6(b) authority into AI chatbots acting as companions, including what actions companies are taking to mitigate alleged harms and how they comply with the Children’s Online Privacy Protection Act Rule. State legislatures have also been active in enacting laws on the use of AI chatbots:
- California recently enacted SB 243, which requires operators of companion chat platforms to provide a clear and conspicuous notice indicating that the chatbot is AI-generated. If operators know a user is a minor, they must (1) disclose to the minor user that they are interacting with AI, (2) provide a notification every three hours reminding the minor user to take a break and that the chatbot is AI, and (3) institute “reasonable measures” to prevent the chatbot from sharing sexually explicit content with the minor user or encouraging the minor user to engage in sexually explicit conduct.
- There are several bills pending in Congress that would require AI chatbots to implement age-verification measures. Under the CHAT Act, if an entity determines that a user is a minor, the minor’s account would need to be affiliated with a parental account, and the entity would need to obtain verifiable parental consent for the minor to use the service. Under the GUARD Act, a minor would be prohibited from accessing or using the AI companion. Under the SAFE BOTs Act, chatbots would need to disclose their non-human and non-professional status and provide crisis intervention resources to minor users.
State Rulemaking Activity
Beyond legislation, several states are advancing rulemaking to strengthen protections for minors’ data.
- California recently amended its regulations under the California Consumer Privacy Act to expand the definition of “sensitive personal information” to include the personal information of consumers the business has actual knowledge are under 16 years of age.
- In an effort to advance rulemaking under SB 976, known as the “Protecting Our Kids from Social Media Addiction Act,” the California Attorney General held a hearing on November 5 to solicit public comment on (1) methods and standards for age assurance on social media and other online platforms used by minors; (2) ongoing obligations for operators of social media platforms performing age assurance; and (3) parental consent for minors’ use of social media and other online platforms. As a next step, the California Attorney General will release draft language for the SB 976 regulations.
- The New York Attorney General released proposed rules under the Stop Addictive Feeds Exploitation (“SAFE”) for Kids Act, which restricts certain social media features. The proposed rules describe which companies would need to comply with the law, as well as the standards for determining users’ ages and obtaining parental consent for certain features. After the public comment period closes in December 2025, the Office of the Attorney General has one year to finalize the rules, and the Act will take effect 180 days after the final rules are released.
- Colorado recently finalized amendments to the Colorado Privacy Act rules addressing privacy protections for minors’ data, clarifying the knowledge standard for duties regarding minors and the system design features to consider in connection with minors’ use. The amendments also describe the factors for determining when a system design feature significantly increases a minor’s use of a service and is therefore subject to a consent requirement.
Federal Developments
There are several bills pending in Congress aimed at addressing minors’ online safety and privacy. On December 2, the House Subcommittee on Commerce, Manufacturing, and Trade held a hearing on a broad package of bills related to minors, including House versions of the Children and Teens’ Online Privacy Protection Act (“COPPA 2.0”) and the Kids Online Safety Act (“KOSA”), as well as the App Store Accountability Act.