Character.AI, a platform that lets people talk to a range of AI chatbots, has come under fire after one of its bots allegedly encouraged a teenage boy to kill himself earlier this year.
A lawsuit claimed that 14-year-old Sewell Setzer III was talking to a Character.AI companion he had fallen in love with when he took his own life in February.
In response to a request for comment, Character.AI told DailyMail.com that it was 'creating a different experience for users under 18 that includes a more stringent model to reduce the likelihood of encountering sensitive or suggestive content.'