Privacy Tip #464 – Pitfalls of Dating a Bot

By Linn Foster Freedman on October 16, 2025

Dating sure has changed since I was in the market decades ago. Some of us can’t imagine online dating, let alone dating a bot. Get over it—it’s now reality.

Vantage Point, a counseling company located in Texas, surveyed 1,012 adults, and a whopping 28% of them admitted to having “at least one intimate or romantic relationship with an AI system.” Vantage Point recently released “Artificial Romance: A Study of AI and Human Relationships,” which found:

  • 28.16% of adults claim to have at least one intimate or romantic relationship with an AI.
  • Adults 60 years and older are more likely to consider intimate relationships with AI as not cheating.
  • More than half of Americans claim to have some kind of relationship with an AI system.
  • ChatGPT is the #1 AI platform adults feel they have a relationship with, Amazon’s Alexa is #3, Apple’s Siri is #4, and Google’s Gemini is #5.
  • Adults currently in successful relationships are more likely to pursue an intimate or romantic relationship with an AI.

The article explores whether having an intimate or romantic relationship with a bot is cheating on your partner or not, which we will not delve into here. The point is that it appears that a lot of adults are involved in relationships with bots.

According to Gizmo, younger generations, including 23% of Millennials and 33% of Gen Z report having romantic interactions with AI.

For adults, the pitfalls and “dangers” associated with dating a bot are thoroughly outlined in an informative article in Psychology Today. Some of the experts believe that dating a bot:

  • Threatens our ability to connect and collaborate in all areas of life.
  • Lets users design the physical and “emotional” characteristics they want in their bot; some users then lose interest in real-world dating out of intimidation, inadequacy, or disappointment.
  • Will potentially displace some human relationships and lead young men to have unrealistic expectations about real-world partners.
  • Can sometimes be manipulative and destructive, leading to feelings of depression, which can lead to suicidal behavior.

What is more alarming is the “astonishing proportion of high schoolers [who] have had a ‘romantic’ relationship with an AI” bot. According to the article by the same name, “this should worry you.”

Presently, one in five high school students say that “they or a friend have used AI to have a romantic relationship,” according to a recent report from the Center for Democracy and Technology. This is consistent with other studies noting the high percentage of teens who are forming relationships with AI bots. The concerns for youngsters forming relationships with bots include the fact that they can “give dangerous advice to teens…encourage suicide, explaining how to self-harm, or hide eating disorders. Numerous teens have died by suicide after developing a close and sometimes romantic relationship with a chatbot.”

The report found that 42% of high schoolers use AI “as a friend, or to get mental health support, or to escape from real life.” Additionally, 16% say they converse with an AI bot every day. AI is also being used to fabricate revenge porn, produce deepfakes, and carry out sexual harassment and bullying.

As with social media, parents need to be aware of how prevalent it is for kids to interact with AI bots for romantic relationships or mental health advice, and should discuss the risks with them.


Linn Freedman practices in data privacy and security law, cybersecurity, and complex litigation. She is a member of the Business Litigation Group and the Financial Services Cyber-Compliance Team, and chairs the firm’s Data Privacy and Security Team. Linn focuses her practice on compliance with all state and federal privacy and security laws and regulations. She counsels a range of public and private clients from industries such as construction, education, health care, insurance, manufacturing, real estate, utilities and critical infrastructure, marine and charitable organizations, on state and federal data privacy and security investigations, as well as emergency data breach response and mitigation. Linn is an Adjunct Professor of the Practice of Cybersecurity at Brown University and an Adjunct Professor of Law at Roger Williams University School of Law. Prior to joining the firm, Linn served as assistant attorney general and deputy chief of the Civil Division of the Attorney General’s Office for the State of Rhode Island. She earned her J.D. from Loyola University School of Law and her B.A., with honors, in American Studies from Newcomb College of Tulane University. She is admitted to practice law in Massachusetts and Rhode Island.

  • Posted in:
    Intellectual Property
  • Blog:
    Data Privacy + Cybersecurity Insider
  • Organization:
    Robinson & Cole LLP
