Why Tech Companies Keep Making Racist Mistakes With AI

Source: Gizmodo

I Created a Biased AI Algorithm 25 Years Ago—Tech Companies Are Still Making the Same Mistake.

Earlier work had shown that skin-colored areas of an image could be extracted in real time. So we decided to focus on skin color as an additional cue for the tracker. I used a digital camera – still a rarity at that time – to take a few shots of my own hand and face, and I also snapped the hands and faces of two or three other people who happened to be in the building. It was easy to manually extract some of the skin-colored pixels from these images and construct a statistical model for the skin colors.
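To make the idea concrete, here is a minimal sketch of what such a skin-color model could look like: fit a Gaussian to the chroma of a handful of hand-labeled skin pixels, then score new pixels by how closely they match. The color space, sample values and helper names are illustrative assumptions, not the actual 1998 code.

```python
import numpy as np

# Minimal sketch (assumed, not the original 1998 code): model "skin color" as a
# Gaussian over normalized chroma, fitted to a handful of hand-labeled pixels.

def to_chroma(rgb):
    """Map RGB pixels (N x 3) to normalized (r, g) chroma to reduce brightness effects."""
    rgb = np.asarray(rgb, dtype=float)
    total = rgb.sum(axis=1, keepdims=True) + 1e-8
    return rgb[:, :2] / total

def fit_skin_model(skin_pixels_rgb):
    """Estimate mean and (regularized) inverse covariance of skin chroma."""
    chroma = to_chroma(skin_pixels_rgb)
    mean = chroma.mean(axis=0)
    cov = np.cov(chroma, rowvar=False) + 1e-6 * np.eye(2)  # regularize the tiny sample
    return mean, np.linalg.inv(cov)

def skin_distance(pixels_rgb, mean, inv_cov):
    """Mahalanobis distance to the skin model: small means 'looks like skin' to this model."""
    d = to_chroma(pixels_rgb) - mean
    return np.sqrt(np.einsum('ij,jk,ik->i', d, inv_cov, d))

# Illustrative sample pixels from a few light-skinned hands and faces (assumed values).
samples = np.array([[221, 183, 160], [210, 168, 150], [233, 190, 172], [205, 160, 139]])
mean, inv_cov = fit_skin_model(samples)

# A pixel close to the sampled tones scores near the model; a pixel from a darker tone
# (assumed value) lands far away, so any threshold tuned on the samples quietly excludes it.
print(skin_distance(np.array([[215, 175, 155], [120, 80, 60]]), mean, inv_cov))
```

The point of the sketch is that the model only knows the skin tones it was shown: anything outside that narrow sample is scored as "not skin" without the developer ever writing an explicitly biased rule.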

In the age of AI, that knapsack needs some new items, such as “AI systems won’t give poor results because of my race.” The invisible knapsack of a white scientist would also need: “I can develop an AI system based on my own appearance, and know it will work well for most of my users.” One suggested remedy for white privilege is to be actively anti-racist. For the 1998 head-tracking system, it might seem obvious that the anti-racist remedy is to treat all skin colors equally.

Scientists also face a nasty subconscious dilemma when incorporating diversity into machine learning models: diverse, inclusive models perform worse than narrow models. A simple analogy can explain this. Imagine you are given a choice between two tasks. Task A is to identify one particular type of tree – say, elm trees. Task B is to identify five types of trees: elm, ash, locust, beech and walnut. Given the same amount of practice, you will almost certainly do better on Task A, because recognizing a single type is an easier problem than distinguishing five.

In the same way, an algorithm that tracks only white skin will be more accurate than an algorithm that tracks the full range of human skin colors. Even if they are aware of the need for diversity and fairness, scientists can be subconsciously affected by this competing need for accuracy. My creation of a biased algorithm was thoughtless and potentially offensive. Even more concerning, this incident demonstrates how bias can remain concealed deep within an AI system.
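A toy simulation, using assumed distributions rather than real data, shows why the narrow detector looks "better" on paper: covering a wider range of skin tones also forces the detector to accept more background pixels as skin.

```python
import numpy as np

# Toy simulation (assumed distributions, not real data) of the accuracy dilemma:
# a detector tuned to one narrow skin-tone group vs. one covering a wide range.
rng = np.random.default_rng(0)

one_group_skin = rng.normal(0.40, 0.01, 5000)   # chroma of a single, narrow group
diverse_skin   = rng.normal(0.40, 0.04, 5000)   # chroma across a wide range of tones
background     = rng.uniform(0.20, 0.60, 5000)  # chroma of non-skin pixels

def detector_stats(lo, hi, skin, bg):
    """Fraction of skin pixels accepted (recall) and background pixels wrongly accepted."""
    recall = np.mean((skin >= lo) & (skin <= hi))
    false_alarms = np.mean((bg >= lo) & (bg <= hi))
    return recall, false_alarms

# Narrow detector: a tight band around the one group it was built from.
print("narrow:   ", detector_stats(0.37, 0.43, one_group_skin, background))
# Inclusive detector: must widen the band to cover everyone, so it also accepts
# far more background pixels as "skin", which is the accuracy penalty the text describes.
print("inclusive:", detector_stats(0.28, 0.52, diverse_skin, background))
```

Both detectors find nearly all of the skin pixels they are responsible for, but the inclusive one pays for its wider acceptance band with many more false alarms, which is exactly the kind of benchmark gap that can subconsciously push developers toward the narrow model.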

The good news is that a great deal of progress on AI fairness has already been made, both in academia and in industry. Microsoft, for example, has a research group known as FATE, devoted to fairness, accountability, transparency and ethics in AI.

 

