Michael Feindt is strategic advisor at Blue Yonder, a leading provider of digital fulfillment and supply chain solutions. With a background in physics and data science, he has played a key role in embedding AI into Blue Yonder's customer supply chain and merchandising processes.
Human-made decisions tend to be inherently unfair: they do not set out to be "right" or to do "wrong," but they are unfair all the same. This happens because we live in a world where not everything is equal, so it makes sense that data reflecting activity in such an environment will show bias. It's an unfortunate fact of life that bias is part of everyday society. Higher earners pay higher taxes, for example; this is not balanced, but many people would still say it is fair.
This sounds like a tall order, but it is well within the realm of possibility to adapt the instructions an AI is given and thereby remove discrimination. That bounces the issue back to the business, because it is entirely up to them to decide exactly what "fair" means. It is not enough to have a general feeling of what is "right": businesses need to give exact quantities and measures for what they want AI to do.
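To make the idea of "exact quantities and measures" concrete, here is a minimal sketch of one way a business might quantify fairness. It uses demographic parity, the difference in positive-decision rates between groups, as the chosen measure; the function name and the toy data are illustrative assumptions, not anything specific to Blue Yonder's systems.

```python
def demographic_parity_gap(decisions, groups):
    """Largest difference in positive-decision rate between any two groups.

    decisions: list of 0/1 outcomes (e.g. 1 = approved)
    groups:    list of group labels, same length as decisions
    """
    counts = {}
    for d, g in zip(decisions, groups):
        total, positive = counts.get(g, (0, 0))
        counts[g] = (total + 1, positive + d)
    rates = [positive / total for total, positive in counts.values()]
    return max(rates) - min(rates)

# Toy example: group "A" is approved 3 times out of 4, group "B" once out of 4.
decisions = [1, 1, 1, 0, 1, 0, 0, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
gap = demographic_parity_gap(decisions, groups)
print(f"demographic parity gap: {gap:.2f}")  # 0.75 - 0.25 = 0.50
```

A business could then state its requirement exactly, e.g. "the gap must stay below 0.05," turning a vague sense of fairness into a measurable target an AI system can be tested against.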