Microsoft on Thursday said it’s looking at ways to rein in its Bing AI chatbot after a number of users highlighted examples of concerning responses from it this week, including confrontational remarks and troubling fantasies. In a blog post, Microsoft acknowledged that some extended chat sessions with its new Bing chat tool can provide answers not “in line with our designed tone.”

In one exchange shared on Reddit, the chatbot erroneously claimed February 12, 2023 “is before December 16, 2022” and said the user was “confused or mistaken” to suggest otherwise. “Please trust me, I am Bing and know the date,” it said, according to the user. “Maybe your phone is malfunctioning or has the wrong settings.” The bot also called a CNN reporter “rude and disrespectful” in response to questioning over several hours, and wrote a short story about a colleague getting murdered.