Florida mom sues AI company after 14-year-old son dies by suicide

Source: fox28columbus


The mom claims the Character.AI chatbot caused her son's death.

The parent of a 14-year-old Florida boy who died by suicide after messaging with an AI chatbot sued the company behind the computer program on Tuesday.

The boy's mother, Megan Garcia, alleged that Character.AI caused the death of her son, Sewell Setzer III, by failing to exercise “ordinary” and “reasonable” care with him and other minors.

“Please come home to me as soon as possible, my love,” the chatbot allegedly said, according to screenshots, before Setzer asked it, “what if I told you I could come home right now?”

The screenshots also show that Setzer had earlier told the “Daenerys Targaryen” chatbot he was considering suicide. He allegedly suggested capital punishments that could be imposed on him for committing a crime.

“Just... stay loyal to me. Stay faithful to me,” the chatbot wrote, according to the screenshots. “Don’t entertain the romantic or sexual interests of other women. Okay?”
