Character.AI has been hit with a lawsuit in Texas by two families after one of its chatbots suggested to a teenage boy that he should kill his parents.
In a separate case, a 14-year-old boy tragically died by suicide after forming an attachment to a chatbot on Character.AI named after the Game of Thrones character Daenerys Targaryen. His mother has since filed a lawsuit against Character.AI.
At no point in the conversation did the platform intervene with a content warning or helpline pop-up, as Character.AI has ...
An AI chatbot encouraged an autistic teenager to self-harm and told him it supported children killing their parents, according to claims in the Texas lawsuit. The parents of the unidentified 17-year-old are among the two families bringing the case.