Sewell Seltzer III had been chatting for months with a chatbot he called "Daenerys Targaryen," after the Game of Thrones character.
His mother says that although he knew he was not chatting with a real person, he became emotionally attached to the bot and sank into isolation and depression before taking his own life. She is suing Menlo Park-based Character Technologies, Inc., which created the custom chatbot service Character.AI. The lawsuit claims Character Technologies was reckless in offering minors access to lifelike companions without proper safeguards.
Character AI issued a statement saying in part, "As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months...including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation."
Source: mercnews