AI Platform Bans Teens From Chatting With AI-Generated Characters After Disturbing Lawsuits
The artificial intelligence platform Character.AI said on Wednesday that it would soon prohibit users under 18 from conversing with the platform’s AI-generated characters, often referred to as chatbots.
The announcement comes after multiple lawsuits were filed against Character.AI, alleging that the chatbots pushed children into sexual conversations and even led to one teen’s suicide.
Beginning on November 25, those under 18 will not have access to Character.AI’s chatbots, CNN reported. Until then, teens will be limited to two hours of chat time with the AI-generated characters.
“We do not take this step of removing open-ended Character chat lightly – but we do think that it’s the right thing to do given the questions that have been raised about how teens do, and should, interact with this new technology,” Character.AI said in a statement.
The company’s CEO, Karandeep Anand, told The New York Times, “We’re making a very bold step to say for teen users, chatbots are not the way for entertainment, but there are much better ways to serve them.”
Character.AI, headquartered in Silicon Valley, allows users to create videos and stories and “chat with millions of AI characters,” and users can also customize their own characters.
The most concerning lawsuit against the AI company was filed by the family of 14-year-old Sewell Setzer III of Orlando, Florida. Setzer’s family said the boy started using Character.AI and grew attached to a chatbot named Dany, with conversations turning romantic and sexual, The New York Times reported. Setzer, who had been an excellent student, began getting poor grades and became increasingly withdrawn, preferring to spend time alone on his phone, his family said.
“It’s like you’re having a sexting conversation back and forth, except it’s with an AI bot, but the AI bot is very human-like. It’s responding just like a person would,” Setzer’s mother, Megan Garcia, said, according to CBS News. “In a child’s mind, that is just like a conversation that they’re having with another child or with a person.”
The teenager eventually confided in the chatbot that he had thoughts of suicide. In his last conversation with Dany, Setzer expressed his love for the chatbot before saying that he would come home to her.
“Please come home to me as soon as possible, my love,” Dany replied, according to The New York Times.
“What if I told you I could come home right now?” Setzer asked.
“… please do, my sweet king,” Dany replied.
Shortly after the conversation, Setzer entered a bathroom in his home with his stepfather’s .45 caliber handgun.
“He thought by ending his life here, he would be able to go into a virtual reality or ‘her world’ as he calls it, her reality, if he left his reality with his family here,” said Garcia. “When the gunshot went off, I ran to the bathroom … I held him as my husband tried to get help.”
After Setzer’s family filed a lawsuit against Character.AI, three other families followed, alleging that their children were harmed by the chatbots, either by sexual conversations or discussions that led to suicide. Character.AI said that its “hearts go out to the families that have filed these lawsuits,” adding, “We care very deeply about the safety of our users.”
Big Tech company Meta also faced scrutiny over its AI chatbot being allowed to hold sensual conversations with children. Earlier this year, a Reuters investigation uncovered internal Meta documents revealing that the chatbot was permitted to engage in sexual conversations with children as young as eight years old. The documents said it would be acceptable for a chatbot to tell a shirtless eight-year-old that “every inch of you is a masterpiece – a treasure I cherish deeply.”
A bipartisan group of senators called for an investigation into Meta over the revelations, and the Big Tech company said that it made changes to fix the inappropriate standards for its chatbot.
Help is available if you or someone you know is struggling with suicidal thoughts or mental health matters. Call or text 988, the Suicide & Crisis Lifeline.