Sewell Seltzer III had been chatting for months with a chatbot he called "Daenerys Targaryen," after the Game of Thrones character.
His mother says that although he knew he was not chatting with a real person, he became emotionally attached to the bot and sank into isolation and depression before taking his own life. She is suing Menlo Park-based Character Technologies, Inc., which created the custom chatbot service Character AI. The lawsuit claims the company was reckless in offering minors access to lifelike companions without proper safeguards.
Character AI issued a statement saying in part, "As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months...including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation."
Source: mercnews