Why Tech Companies Keep Making Racist Mistakes With AI

  • 📰 Gizmodo


I Created a Biased AI Algorithm 25 Years Ago—Tech Companies Are Still Making the Same Mistake.

Earlier research had shown that skin-colored areas of an image could be extracted in real time. So we decided to focus on skin color as an additional cue for the tracker.

I used a digital camera – still a rarity at that time – to take a few shots of my own hand and face, and I also snapped the hands and faces of two or three other people who happened to be in the building. It was easy to manually extract some of the skin-colored pixels from these images and construct a statistical model for the skin colors.
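A statistical skin-color model of this kind can be sketched in a few lines. This is an illustrative reconstruction, not the original 1998 code: it fits a single Gaussian to hand-labeled RGB skin pixels and classifies new pixels by squared Mahalanobis distance. The function names and the threshold are my own assumptions.

```python
import numpy as np

def fit_skin_model(skin_pixels):
    """Fit a Gaussian to hand-labeled skin pixels.

    skin_pixels: (N, 3) array of RGB values extracted manually
    from a handful of training photos.
    """
    mean = skin_pixels.mean(axis=0)
    cov = np.cov(skin_pixels, rowvar=False)
    return mean, np.linalg.inv(cov)

def skin_mask(image, mean, inv_cov, threshold=16.0):
    """Mark a pixel as skin when its squared Mahalanobis distance
    to the model mean falls below the threshold."""
    diff = image.reshape(-1, 3).astype(float) - mean
    d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
    return (d2 < threshold).reshape(image.shape[:2])
```

The failure mode the article describes falls directly out of this design: the Gaussian is centered on whatever skin tones happened to be in the training photos, so pixels from tones outside that small sample land far from the mean and are silently rejected.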

In the age of AI, that knapsack needs some new items, such as "AI systems won't give poor results because of my race." The invisible knapsack of a white scientist would also need: "I can develop an AI system based on my own appearance, and know it will work well for most of my users."

One suggested remedy for white privilege is to be actively anti-racist. For the 1998 head-tracking system, it might seem obvious that the anti-racist remedy is to treat all skin colors equally.

Scientists also face a nasty subconscious dilemma when incorporating diversity into machine learning models: Diverse, inclusive models perform worse than narrow models.

A simple analogy can explain this. Imagine you are given a choice between two tasks. Task A is to identify one particular type of tree – say, elm trees. Task B is to identify five types of trees: elm, ash, locust, beech and walnut. Given the same amount of practice, you will almost certainly become more accurate at the narrow Task A than at the broad Task B.

In the same way, an algorithm that tracks only white skin will be more accurate than an algorithm that tracks the full range of human skin colors. Even if they are aware of the need for diversity and fairness, scientists can be subconsciously affected by this competing need for accuracy.

My creation of a biased algorithm was thoughtless and potentially offensive. Even more concerning, this incident demonstrates how bias can remain concealed deep within an AI system.
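The accuracy trade-off can be demonstrated on synthetic data. The sketch below is my own illustration, not the author's experiment: it fits one Gaussian skin model to a single simulated skin-tone group ("narrow") and another to two groups ("inclusive"), then compares detection rates and false positives. All distributions, names, and thresholds are invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical RGB pixel distributions for two skin-tone groups,
# plus random non-skin background pixels.
light = rng.normal([220, 180, 160], 12, size=(1000, 3))
dark = rng.normal([110, 75, 60], 12, size=(1000, 3))
background = rng.uniform(0, 255, size=(5000, 3))

def fit(pixels):
    """Gaussian model: mean and inverse covariance of the samples."""
    return pixels.mean(axis=0), np.linalg.inv(np.cov(pixels, rowvar=False))

def detect_rate(pixels, mean, inv_cov, threshold=16.0):
    """Fraction of pixels within the Mahalanobis threshold."""
    diff = pixels - mean
    d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
    return (d2 < threshold).mean()

narrow = fit(light)                        # trained on one group only
inclusive = fit(np.vstack([light, dark]))  # trained on both groups

# The narrow model is nearly perfect for its own group but fails
# almost completely for the other; the inclusive model covers both
# groups, at the cost of accepting more non-skin background pixels.
print(detect_rate(light, *narrow), detect_rate(dark, *narrow))
print(detect_rate(light, *inclusive), detect_rate(dark, *inclusive))
print(detect_rate(background, *narrow), detect_rate(background, *inclusive))
```

The inclusive model's covariance is inflated by the between-group spread, so its acceptance region is much larger: it tracks everyone, but lets in more false positives – exactly the narrow-versus-diverse accuracy tension described above.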

The good news is that a great deal of progress on AI fairness has already been made, both in academia and in industry. Microsoft, for example, has a research group known as FATE, devoted to fairness, accountability, transparency and ethics in AI.

 

