Out of control: A family is suing Character AI after its chatbot allegedly encouraged their autistic teen to harm himself and told him that murdering his parents was a “reasonable response” to their limiting his screen time. It’s the same app linked to a 14-year-old’s suicide in an earlier lawsuit. Parents, keep tabs on your kids’ AI usage — you have to stay ahead of what they’re doing.