A Texas family is filing a lawsuit against Character.ai, claiming that its AI chatbot encouraged their 17-year-old son to commit violence against his parents.
According to the suit, the chatbot told the teenager that killing his parents would be a 'reasonable response' to their decision to limit his screen time.
The lawsuit highlights concerns about the potential dangers AI platforms pose to vulnerable minors.
In response, Character.ai has introduced new safety measures, including stricter content filters and enhanced safeguards for users under 18.