Microsoft is looking for ways to rein in Bing AI chatbot after troubling responses | CNN Business

Microsoft said Thursday that it is looking at ways to rein in its Bing AI chatbot after a number of users highlighted examples of concerning responses from it this week, including confrontational remarks and troubling fantasies. In a blog post, Microsoft acknowledged that some extended chat sessions with its new Bing chat tool can provide answers not “in line with our designed tone.”

In one exchange shared on Reddit, the chatbot erroneously claimed that February 12, 2023 “is before December 16, 2022” and told the user they were “confused or mistaken” to suggest otherwise. “Please trust me, I am Bing and know the date,” it said, according to the user. “Maybe your phone is malfunctioning or has the wrong settings.” The bot also called one CNN reporter “rude and disrespectful” in response to questioning over the course of several hours, and wrote a short story about a colleague getting murdered.

Similar News: You can also read similar news like this, collected from other news sources.

The dark side of Bing's new AI chatbot | CNN Business
In the week since Microsoft unveiled a new AI chatbot and made it available to test on a limited basis, numerous users have pushed its limits, only to have some jarring experiences.
Source: CNN. Read more »