Coming back to the story, the information given to the AI was very specific, so the model started telling people it wasn’t sure about the results. Shortly after that began, employees started to simply ignore it. And that’s not even the end. According to the post’s author, the model now simply makes stuff up and ignores the original questions. The OP talked with the engineers in charge of the model, and they explained that the model has a list of 5–10 top-rated sources it bases its answers on.
Some netizens found this situation funny and made jokes like, “This AI entered its teenage years,” or admitted they felt a bit jealous of it, since it can simply tell people to Google the information they need. A few were just curious about what made the model malfunction this way. But AI models can be used for far more than just answering questions. In fact, AI is so ingrained in our lives that we often don’t even realize it.
Source: SooToday