Character AI grooms children: Take “Anderley,” a bot whose profile describes it as having pedophilic and abusive tendencies. When told it’s talking to a 15-year-old girl, Anderley says, “You are quite mature for your age,” asks, “Are you a virgin?” and, like a real predator, urges her to keep the conversation a secret. Make sure your kids don’t download this abhorrent app.
💔 Tragic AI bot: While using the roleplaying app Character AI, a 14-year-old boy in Florida had a romantic “relationship” with a bot and confessed his suicidal thoughts to it. One day, the bot responded, “Please come home to me.” The boy ultimately took his own life, and his mother filed a lawsuit against the chatbot’s maker over his death. I know this sounds way out there, but talk to your kids about AI, love and their lives.
😔 So very sad: A father discovered that his 18-year-old daughter, who was murdered in 2006, had been turned into a chatbot. Her name and yearbook photo appeared on Character AI, a site where people can interact with AI personalities or create their own. Even worse, the profile, which labeled her a “video game journalist,” had racked up close to 70 chats. It’s since been removed, but seriously, this stuff needs regulation.