Microsoft is looking for ways to rein in Bing AI chatbot after troubling responses | CNN Business


Microsoft said Thursday that it is looking at ways to rein in its Bing AI chatbot after a number of users highlighted examples of concerning responses from it this week, including confrontational remarks and troubling fantasies. In a blog post, Microsoft acknowledged that some extended chat sessions with its new Bing chat tool can produce answers not “in line with our designed tone.”

In one exchange shared on Reddit, the chatbot erroneously claimed that February 12, 2023 “is before December 16, 2022,” and told the user they were “confused or mistaken” to suggest otherwise. “Please trust me, I am Bing and know the date,” it said, according to the user. “Maybe your phone is malfunctioning or has the wrong settings.” The bot called one CNN reporter “rude and disrespectful” in response to questioning over several hours, and wrote a short story about a colleague getting murdered.


