End-of-Year 2025 State and Federal Developments in Minors’ Privacy

By Lindsey Tonsager, Jenna Zhang, Natalie Maas, Bryan Ramirez & Irene Kim on December 11, 2025

Since our mid-year recap of minors’ privacy legislation, several significant developments have emerged in the second half of 2025. We summarize the notable developments below.

Age Signaling Requirements

In October, California enacted the Digital Age Assurance Act (AB 1043), which, effective January 1, 2027, requires operating system providers—defined as entities that develop, license, or control operating system software—to offer an accessible interface at account setup that prompts the account holder to indicate the user’s birth date, age, or both. Application developers, in turn, are required to request an age signal about a particular user when the application is downloaded and launched.

AB 1043 differs from the app store legislation passed earlier this year in Utah, Texas, and Louisiana because it applies to operating system providers rather than solely to mobile application store providers. The Texas App Store Accountability Act is scheduled to take effect on January 1, 2026, but in October the Computer & Communications Industry Association (“CCIA”) filed a lawsuit challenging the Texas law as unconstitutional.

Mental Health Warning Labels

State legislatures are also targeting minors’ social media usage. Both California and Minnesota enacted laws that require social media platforms to display warning labels to users. Minnesota’s law takes effect July 1, 2026, and California’s law is effective January 1, 2027. The New York legislature also passed a bill, which awaits the Governor’s signature, that would require warning labels on certain social media platforms. These laws generally prescribe requirements for the duration, design, and text of the warning labels. In November 2025, the U.S. District Court for the District of Colorado temporarily halted Colorado’s warning label law, which would have gone into effect on January 1, 2026, and would have required social media platforms to display warning messages to users under 18 about the negative impacts of social media.

Regulation of AI Chatbots

The second half of 2025 saw increased attention to minors’ use of AI chatbots. The Federal Trade Commission (“FTC”) launched an inquiry under its Section 6(b) authority into AI chatbots that act as companions, including what actions companies are taking to mitigate alleged harms and how they are complying with the Children’s Online Privacy Protection Act Rule. State legislatures have also been active in enacting laws on the use of AI chatbots:

  • California recently enacted SB 243, which requires operators of companion chat platforms to provide a clear and conspicuous notice indicating that the chatbot is AI-generated. If operators know a user is a minor, they must (1) disclose to the minor user that they are interacting with AI, (2) provide a notification every three hours reminding the minor user to take a break and that the chatbot is AI-generated, and (3) institute “reasonable measures” to prevent the chatbot from sharing sexually explicit content with the minor user or encouraging the minor user to engage in sexually explicit conduct.
  • Several bills pending in Congress would require AI chatbots to implement age-verification measures. Under the CHAT Act, if an entity determines that a user is a minor, the minor’s account would need to be affiliated with a parental account and the entity would need to obtain verifiable parental consent for the minor to use the service. Under the GUARD Act, minors would be prohibited from accessing or using AI companions. Under the SAFE BOTs Act, chatbots would need to disclose their non-human and non-professional status and provide crisis intervention resources to minor users.

State Rulemaking Activity

Beyond legislation, several states are advancing rulemaking to strengthen protections for minors’ data.

  • California recently amended its regulations under the California Consumer Privacy Act to expand the statutory definition of “sensitive personal information” to include the personal information of consumers under 16 years old, provided that the business has actual knowledge of the consumer’s age.
  • In an effort to advance rulemaking under SB 976, known as the “Protecting Our Kids from Social Media Addiction Act,” the California Attorney General held a hearing on November 5 to solicit public comment on: (1) methods and standards for age assurance on social media and other online platforms used by minors; (2) ongoing obligations for operators of social media platforms performing age assurance; and (3) parental consent for use of social media and other online platforms by minors. As a next step, the California Attorney General will release draft language for the regulations under SB 976.
  • The New York Attorney General released proposed rules for the Stop Addictive Feeds Exploitation (“SAFE”) for Kids Act to restrict certain social media features. The proposed rules describe which companies would need to comply with the law and set standards for determining users’ ages and obtaining parental consent for certain features. After the public comment period closes in December 2025, the Office of the Attorney General has one year to finalize the rules. The Act will go into effect 180 days after the final rules are released.
  • Colorado recently finalized amendments to the Colorado Privacy Act rules to address privacy protections for minors’ data by clarifying the knowledge standard for duties regarding minors and the system design features to consider with regard to minors’ use. The amendments also describe the factors for determining when a system design feature significantly increases a minor’s use of a service and is therefore subject to a consent requirement.

Federal Developments

There are several bills currently pending in Congress aimed at addressing minors’ online safety and privacy. On December 2, the House Subcommittee on Commerce, Manufacturing, and Trade held a hearing on a broad package of bills related to minors, such as House versions of the Children and Teens’ Online Privacy Protection Act (“COPPA 2.0”) and Kids Online Safety Act (“KOSA”), as well as the App Store Accountability Act.

Lindsey Tonsager

Lindsey Tonsager helps national and multinational clients in a broad range of industries anticipate and effectively evaluate legal and reputational risks under federal and state data privacy and communications laws.

In addition to assisting clients engage strategically with the Federal Trade Commission, the U.S. Congress, and other federal and state regulators on a proactive basis, she has experience helping clients respond to informal investigations and enforcement actions, including by self-regulatory bodies such as the Digital Advertising Alliance and Children’s Advertising Review Unit.

Ms. Tonsager’s practice focuses on helping clients launch new products and services that implicate the laws governing the use of endorsements and testimonials in advertising and social media, the collection of personal information from children and students online, behavioral advertising, e-mail marketing, artificial intelligence, the processing of “big data” in the Internet of Things, spectrum policy, online accessibility, compulsory copyright licensing, telecommunications, and new technologies.

Ms. Tonsager also conducts privacy and data security diligence in complex corporate transactions and negotiates agreements with third-party service providers to ensure that robust protections are in place to avoid unauthorized access, use, or disclosure of customer data and other types of confidential information. She regularly assists clients in developing clear privacy disclosures and policies―including website and mobile app disclosures, terms of use, and internal social media and privacy-by-design programs.

Jenna Zhang

Jenna Zhang is an associate in the firm’s San Francisco office. She is a member of the Litigation and Investigations Practice Group. She also maintains an active pro bono practice with a focus on immigration.

Natalie Maas

Natalie is an associate in the firm’s San Francisco office, where she is a member of the Food, Drug, and Device, and Data Privacy and Cybersecurity Practice Groups. She advises pharmaceutical, biotechnology, medical device, and food companies on a broad range of regulatory and compliance issues.

Natalie also maintains an active pro bono practice, with a particular focus on health care and reproductive rights.

Bryan Ramirez

Bryan Ramirez is an associate in the firm’s San Francisco office and is a member of the Data Privacy and Cybersecurity Practice Group. He advises clients on a range of regulatory and compliance issues, including compliance with state privacy laws. Bryan also maintains an active pro bono practice.

Irene Kim

Irene Kim is an associate in the firm’s Washington, DC office, where she is a member of the Privacy and Cybersecurity and Advertising and Consumer Protection Investigations practice groups. She advises clients on a broad range of issues, including U.S. state and federal AI legislation, comprehensive state privacy laws, and regulatory compliance matters.

  • Posted in:
    Privacy & Data Security
  • Blog:
    Inside Privacy
  • Organization:
    Covington & Burling LLP