Sewell Seltzer III had been chatting for months with a chatbot he called "Daenerys Targaryen," after the Game of Thrones character.
His mother says that although he knew he was not chatting with a real person, he became emotionally attached to the bot and sank into isolation and depression before taking his own life. She is suing Menlo Park-based Character Technologies, Inc., which created the custom chatbot service Character AI. The lawsuit claims Character Technologies was reckless by offering minors access to lifelike companions without proper safeguards.
Character AI issued a statement saying in part, "As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months...including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation."
Source: mercnews