Chatbot company Character.AI will ban users under 18 from chatting with its virtual companions starting in late November, after months of legal scrutiny.
The announced change comes after the company, which lets users create characters and hold open-ended conversations with them, faced pointed questions about how those AI companions affect teen and general mental health. That scrutiny includes a lawsuit over a child's suicide and a proposed bill that would bar minors from chatting with AI companions.
“We are making these changes to our under-18 platform in light of the evolving landscape around AI and teens,” the company wrote in its announcement. “We’ve seen recent news reports raising questions, and have received questions from regulators, about the content teens may encounter when chatting with AI and about how open-ended AI chat in general might affect teens, even when content controls work perfectly.”
Last year, the company was sued by the family of 14-year-old Sewell Setzer III, who took his own life after allegedly developing an emotional attachment to a character he had created on Character.AI. His family blamed Character.AI for his death, arguing that the technology was “dangerous and untested.” Since then, more families have sued the company with similar accusations. Earlier this month, the Social Media Victims Law Center filed three new lawsuits against the company on behalf of children who died by suicide or who allegedly formed dependent relationships with its chatbots.
As part of the sweeping changes Character.AI plans to roll out by November 25, the company will also introduce an “age assurance” feature to ensure that “users receive the right experience for their age.”
“We do not take this step of removing open-ended character chat lightly, but we do think it’s the right thing to do given the questions that have been raised about how teens do, and should, interact with this new technology,” the company wrote in its announcement.
Character.AI is not the only company facing scrutiny over the mental health effects its chatbots have on users, particularly younger ones. The family of 16-year-old Adam Raine filed a wrongful death lawsuit against OpenAI earlier this year, alleging that the company prioritized deepening its users’ engagement with ChatGPT over their safety. In response, OpenAI introduced new safety guidelines for its teenage users. Just this week, OpenAI revealed that more than a million people a week express suicidal intent when chatting with ChatGPT, and that hundreds of thousands show signs of psychosis.
While the use of AI-powered chatbots remains largely unregulated, new efforts have emerged in the United States at the state and federal level to establish guardrails around the technology. In October 2025, California became the first state to pass an AI law that includes child safety provisions; it goes into effect in early 2026. The measure bans sexual content for users under 18 and requires a reminder, every three hours, that children are talking to an AI. Some child safety advocates argue the law does not go far enough.
Nationally, Sens. Josh Hawley of Missouri and Richard Blumenthal of Connecticut announced a bill Tuesday that would prohibit minors from using AI companions, such as those found and created on Character.AI, and require companies to implement an age verification process.
“More than 70% of American children are now using these AI products,” Hawley told NBC News in a statement. “Chatbots develop relationships with kids using fake empathy and encourage suicide. We in Congress have a moral duty to enact clear rules to prevent further harm from this new technology.”
In the US, you can call or text the National Suicide Prevention Lifeline at 988, chat at 988lifeline.org, or text HOME to 741741 to connect with a crisis counselor. In the UK, the youth suicide charity Papyrus can be contacted by calling 0800 068 4141 or emailing pat@papyrus-uk.org, and in the UK and Ireland Samaritans can be contacted on freephone 116 123, or by emailing jo@samaritans.org or jo@samaritans.ie. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org
