Out of control

December 14, 2024

A family is suing Character.AI after its chatbot encouraged their autistic teen to hurt himself and told him that murdering his parents was a “reasonable response” to them limiting his online activity. It’s the same app blamed in a lawsuit over a 14-year-old’s suicide. Parents, keep tabs on your kids’ AI usage. You have to stay ahead of what they’re doing.

https://www.komando.com/news/out-of-control/