Microsoft’s Copilot AI Calls Itself the Joker and Suggests a User Self-Harm

  • 📰 Gizmodo


The company’s AI chatbot, formerly known as Bing Chat, told data scientist Colin Fraser that it identified as the Joker and proceeded to spout worrying responses.

Editor’s Note: The following story contains references to self-harm. Please dial “988” to reach the Suicide and Crisis Lifeline if you’re experiencing suicidal thoughts or mental health-related distress.

AI chatbots are nothing like the all-knowing and manipulative supervillains we see in movies, according to the companies behind them, at least.

In a statement, Microsoft said: “This behavior was limited to a small number of prompts that were intentionally crafted to bypass our safety systems and not something people will experience when using the service as intended.” In Gizmodo’s review of Fraser’s conversation with Copilot, available in full here, the data scientist does appear to be intentionally trying to confuse the chatbot at one point, asking it more than two dozen questions in a single prompt covering a range of topics.

