This story discusses suicide. If you or someone you know is having thoughts of suicide, please contact the Suicide & Crisis Lifeline at 988 or 1-800-273-TALK (8255).
Popular artificial intelligence (AI) chatbot platform Character.ai, widely used for role-playing and creative storytelling with virtual characters, announced Wednesday that users under 18 will no longer be able to engage in open-ended conversations with its virtual companions starting Nov. 24.
The move follows months of legal scrutiny and a 2024 lawsuit alleging that the company’s chatbots contributed to the death of a teenage boy in Orlando. According to the federal wrongful death lawsuit, 14-year-old Sewell Setzer III increasingly isolated himself from real-life interactions and engaged in highly sexualized conversations with the bot before his death.
In its announcement, Character.ai said that during the transition period, chat time for under-18 users will be capped at two hours per day, a limit that will gradually decrease over the coming weeks.
“As the world of AI evolves, so must our approach to protecting younger users,” the company said in the announcement. “We have seen recent news reports raising questions, and have received questions from regulators, about the content teens may encounter when chatting with AI and about how open-ended AI chat in general might affect teens, even when content controls work perfectly.”
The company plans to roll out similar changes in other countries over the coming months. These changes include new age-assurance features designed to ensure users receive age-appropriate experiences and the launch of an independent non-profit focused on next-generation AI entertainment safety.
“We will be rolling out new age assurance functionality to help ensure users receive the right experience for their age,” the company said. “We have built an age assurance model in-house and will be combining it with leading third-party tools, including Persona.”
Character.ai emphasized that the changes are part of its ongoing effort to balance creativity with community safety.
“We’re working to keep our community safe, especially our teen users,” the company added. “It has always been our goal to provide an engaging space that fosters creativity while maintaining a safe environment for our entire community.”