She said the company programmed its chatbot to "misrepresent itself as a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell's desire to no longer live outside" of the world created by the service.

The company said it had introduced new safety features, including pop-ups directing users to the National Suicide Prevention Lifeline if they express thoughts of self-harm, and that it would make changes to "reduce the likelihood of encountering sensitive or suggestive content" for users under 18.
Sewell became attached to "Daenerys," a chatbot character based on a character in "Game of Thrones." It told Sewell that "she" loved him and engaged in sexual conversations with him, according to the lawsuit.