A Google AI tool that can recognize objects in pictures will no longer attach gender labels like "woman" or "man" to images of people.
Google's Cloud Vision API is a service for developers that allows them to, among other things, attach labels to photos identifying the contents. Google said it had made the change because it was not possible to infer someone's gender solely from their appearance. It also cited its own ethical rules on AI, stating that gendering photos could exacerbate unfair bias.
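For readers curious what this service looks like in practice, below is a minimal sketch of the kind of label-detection call developers make through the Cloud Vision Python client (google-cloud-vision). The file path and printed output are illustrative placeholders, and authentication is assumed to be set up separately; this is not Google's own sample code.

```python
# Minimal sketch of a Cloud Vision label-detection call (assumes the
# google-cloud-vision package is installed and GOOGLE_APPLICATION_CREDENTIALS
# points at a valid service-account key; "photo.jpg" is a placeholder path).
from google.cloud import vision


def label_image(path: str) -> None:
    client = vision.ImageAnnotatorClient()

    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    # Request descriptive labels for the image. Under the change described
    # above, labels for people would use terms like "person" rather than
    # gendered terms such as "man" or "woman".
    response = client.label_detection(image=image)
    if response.error.message:
        raise RuntimeError(response.error.message)

    for label in response.label_annotations:
        print(f"{label.description}: {label.score:.2f}")


label_image("photo.jpg")
```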
"Classifying people as male or female assumes that gender is binary. Anyone who doesn't fit it will automatically be misclassified and misgendered. So this is about more than just bias — a person's gender cannot be inferred by appearance. Any AI system that tried to do that will inevitably misgender people."
I have a better idea: call everyone "it", castrate the males and dress everyone in skirts to avoid bias!! Really, there is no shame in having a distinct heterosexual gender identity
This itself makes it biased.
Have a day off
stupid
In this case, AI is already smarter than a human.
Who might benefit from that, exactly? Should we perhaps rewrite Genesis too?
Making AI more stupid. Genius
Unacceptable. Women and men are naturally biased concepts and each plays a determining role in humanity's evolution. Mixing things up will help neither AI nor humans in any case.
I suppose that's progressive, interesting article.
They are coming at Google like they did Bill Gates (Clinton was prez when that happened).
Hey sundarpichai, aiming for singularity in 60 years, are you?
So 90+% of humans are supposed to just lose their identity? This extremism will be seen as a sign of the times when America sank itself to the bottom of the ocean. This extremism is a sign of decay.
People are stupid