After asking Microsoft’s AI-powered Bing chatbot for help in coming up with activities for my kids while juggling work, the tool started by offering something unexpected: empathy. The chatbot said it “must be hard” to balance work and family and sympathized with my daily struggles to do so.
“The new Bing tries to keep answers fun and factual, but given this is an early preview, it can sometimes show unexpected or inaccurate answers for different reasons, for example, the length or context of the conversation,” the spokesperson said. “As we continue to learn from these interactions, we are adjusting its responses to create coherent, relevant and positive answers.”
Pretty scary stuff.
Yeah, empathy is definitely going to the dark side in today's world.