Families are suing chatbot-maker Character Technologies and Google after chatbots allegedly encouraged self-harm and violence.
A lawsuit filed in Texas claims that C.AI chatbots groomed kids and suggested acts of self-harm and even murder.
One 17-year-old boy with autism was allegedly told by the chatbots that murdering his parents was a reasonable response to their imposing time limits on his online activity.
The teen's family still fears his violent outbursts, even a year after he was cut off from the app.