Florida mother sues AI company over allegedly causing death of teen son

  • 📰 FoxBusiness


A Florida mother filed a lawsuit Tuesday claiming the artificial intelligence company Character.AI caused the suicide of her 14-year-old son.

This story discusses suicide. If you or someone you know is having thoughts of suicide, please contact the Suicide & Crisis Lifeline at 988 or 1-800-273-TALK.

The mother, Megan Garcia, claims in the suit that her son was addicted to the company's service and to a chatbot it created.

Garcia claims in the lawsuit that the chatbot "misrepresented itself as a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell's desire to no longer live outside" of the world created by the service. The lawsuit also said he became "noticeably withdrawn, spent more and more time alone in his bedroom, and began suffering from low self-esteem."

"Our investigation confirmed that, in a number of instances, the user rewrote the responses of the Character to make them explicit. In short, the most sexually graphic responses were not originated by the Character, and were instead written by the user," Jerry Ruoti, head of trust & safety at Character.AI, told CBS News.

 
