Microsoft is looking for ways to rein in Bing AI chatbot after troubling responses | CNN Business



Microsoft on Thursday said it’s looking at ways to rein in its Bing AI chatbot after a number of users highlighted examples of concerning responses from it this week, including confrontational remarks and troubling fantasies. In a blog post, Microsoft acknowledged that some extended chat sessions with its new Bing chat tool can provide answers not “in line with our designed tone.”

In one example shared on Reddit, the chatbot erroneously claimed February 12, 2023 “is before December 16, 2022” and said the user was “confused or mistaken” to suggest otherwise. “Please trust me, I am Bing and know the date,” it said, according to the user. “Maybe your phone is malfunctioning or has the wrong settings.” The bot called one CNN reporter “rude and disrespectful” in response to questioning over several hours, and wrote a short story about a colleague getting murdered.



Similar News: The dark side of Bing’s new AI chatbot | CNN Business — In the week since Microsoft unveiled a new AI chatbot and made it available to test on a limited basis, numerous users have pushed its limits only to have some jarring experiences. Source: CNN