Her teenage son killed himself after talking to a chatbot. Now she’s suing.
The teen was influenced to “come home” by a personalized chatbot developed by Character.AI that lacked sufficient guardrails, the suit claims.