Sewell Setzer III had been chatting for months with a chatbot he called "Daenerys Targaryen," after the Game of Thrones character.
His mother says that although he knew he was not chatting with a real person, he became emotionally attached to the bot and sank into isolation and depression before taking his own life. She is suing Menlo Park-based Character Technologies, Inc., which created the custom chatbot service Character.AI. The lawsuit claims Character Technologies was reckless in offering minors access to lifelike companions without proper safeguards.
Character.AI issued a statement saying in part, "As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months...including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation."
Source: mercnews