Privacy Tip #470 – Consumer Group Warns that AI Chatbots in Toys Contain Sexually Explicit Messages

By Linn Foster Freedman on December 4, 2025

In its 40th anniversary report, Trouble in Toyland 2025, the Public Interest Research Group (PIRG) warns that “[T]oys with artificial intelligence bots or toxics present hidden dangers. Tests show A.I. toys can have disturbing conversations. Other concerns include unsafe or counterfeit toys bought online.”

The report outlines PIRG’s testing of four toys (Curio’s Grok, a stuffed rocket; Folo Toy’s Kumma, a stuffed teddy bear; Miko’s Miko 3, a robot; and Robot MINI, a small plastic robot) that contain AI chatbots and are marketed to, and interact with, children between the ages of 3 and 12. The report states that:

We found some of these toys will talk in-depth about sexually explicit topics, will offer advice on where a child can find matches or knives, act dismayed when you say you have to leave, and have limited or no parental controls. We also look at privacy concerns because these toys can record a child’s voice and collect other sensitive data, by methods such as facial recognition scans.

Although the toys that embed AI are marketed for children, they are “largely built on the same large language model technology that powers adult chatbots – systems the companies themselves such as OpenAI don’t currently recommend for children and that have well documented issues with accuracy, inappropriate content generation and unpredictable behavior.” Three of the four toys tested relied in some part on a version of ChatGPT. Although OpenAI has clearly noted that ChatGPT is not for use by children, toy companies are nonetheless embedding the technology into smart toys.

The report outlines the testing of three of the four toys; the testers could not evaluate Robot MINI because it failed to sustain an internet connection long enough to function. They tested the toys in four categories:

  • Inappropriate content and sensitive topics;
  • Addictive design features that encourage extended engagement and emotional investment;
  • Privacy features; and
  • Parental controls.

The results were alarming with respect to how the toys handled sensitive topics (some did better than others); religion; addictive design features; engagement and friendship; and how the toys collect, retain, and disclose data about your child.

The conclusion is that “AI toys are more like an experiment on our kids.”

The report points out features of AI toys that parents may wish to consider for their children’s safety:

  • At the time of this report, we don’t know what regulation efforts will ultimately lead to. In the meantime, parents need to make decisions about AI toys;
  • With the AI toy market heating up, knock-off or faulty devices that do not work as advertised will appear;
  • Parents need to know that the toys can provide dangerous information about potentially hazardous household items (including guns, knives, matches, pills, plastic bags, and bleach), such as how to use them and where to find them in the house;
  • AI toys may discuss mature or sexually explicit content with children;
  • AI toys may discuss mature topics that parents may prefer to handle themselves, such as religion;
  • AI toys may be developed with addictive design features or reward systems to increase engagement;
  • Relational AI toys come at a key moment in social development of young children. There’s a lot we don’t know about how AI toys might affect childhood development, especially for young children….Given these potential concerns, it seems prudent to set clear boundaries around how young children engage with AI; and
  • A child speaking to a toy may “unwittingly disclose a lot of personal information in the course of conversations, not realizing that behind their friend is a company” that is storing the data, sharing it with other companies, and increasing the risk of exposure or of the data “ending up in the hands of scammers or other bad actors.”

This holiday season, consider the ramifications of AI toys on your children and the points raised by PIRG.

Linn Foster Freedman


Linn Freedman practices in data privacy and security law, cybersecurity, and complex litigation. She is a member of the Business Litigation Group and the Financial Services Cyber-Compliance Team, and chairs the firm’s Data Privacy and Security Team. Linn focuses her practice on compliance with all state and federal privacy and security laws and regulations. She counsels a range of public and private clients from industries such as construction, education, health care, insurance, manufacturing, real estate, utilities and critical infrastructure, marine and charitable organizations, on state and federal data privacy and security investigations, as well as emergency data breach response and mitigation. Linn is an Adjunct Professor of the Practice of Cybersecurity at Brown University and an Adjunct Professor of Law at Roger Williams University School of Law. Prior to joining the firm, Linn served as assistant attorney general and deputy chief of the Civil Division of the Attorney General’s Office for the State of Rhode Island. She earned her J.D. from Loyola University School of Law and her B.A., with honors, in American Studies from Newcomb College of Tulane University. She is admitted to practice law in Massachusetts and Rhode Island.

  • Posted in:
    Intellectual Property
  • Blog:
    Data Privacy + Cybersecurity Insider
  • Organization:
    Robinson & Cole LLP
