Systems Are Unfair
"It's not fair!" cries the child, and the child is right. No matter how well we design a system, that system will be unfair. We will have systemic unfairness no matter what we do, including systemic racism, systemic sexism, and systemic ageism. The important thing is to find ways to mitigate the unfairness, offer ways around the system, and allow for mercy in judgment. The worst thing we can do is make the system absolute: requiring "mandatory sentences", admitting by test scores alone, or deciding by skin color. Any "unfairness" is also an opportunity for the savvy entrepreneur.
Systems are not fair, and no system can be made totally fair. Any attempt to make a system fair by one criterion will naturally make it unfair by other criteria.
When we examine the concept of "fairness" closely, we can show that total fairness is impossible because fairness itself contains conflicting ideas. As human beings, we hold multiple criteria for what counts as "fair", and these criteria conflict with each other.
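A tiny numeric sketch can make this conflict concrete. The scenario below is hypothetical (the group sizes and base rates are invented for illustration): when two groups qualify at different rates, even a perfectly accurate selection system, fair by the "judge everyone correctly" criterion, is unfair by the "select from every group at the same rate" criterion. Satisfying one criterion forces violating the other.

```python
# Hypothetical example: two groups with different base rates of
# qualification. A selector that is perfectly accurate for both
# groups cannot also have equal selection rates across the groups.

def selection_rate(predictions):
    """Fraction of people the system selects (predicts 1 for)."""
    return sum(predictions) / len(predictions)

# Group A: 60 of 100 truly qualify; Group B: 20 of 100 truly qualify.
# (Invented numbers, chosen only to make the arithmetic obvious.)
truth_a = [1] * 60 + [0] * 40
truth_b = [1] * 20 + [0] * 80

# The most accurate possible selector predicts the truth exactly.
pred_a, pred_b = truth_a[:], truth_b[:]

accuracy_a = sum(p == t for p, t in zip(pred_a, truth_a)) / len(truth_a)
accuracy_b = sum(p == t for p, t in zip(pred_b, truth_b)) / len(truth_b)

print(accuracy_a, accuracy_b)       # 1.0 1.0 -- "fair" by accuracy
print(selection_rate(pred_a))       # 0.6
print(selection_rate(pred_b))       # 0.2 -- "unfair" by equal selection

# To equalize the selection rates, we would have to reject qualified
# people in group A or accept unqualified people in group B -- making
# the system "unfair" by the accuracy criterion instead.
```

The point is not the particular numbers: whenever the groups' base rates differ, the two criteria pull in opposite directions, so a system judged "fair" by one measure is judged "unfair" by the other.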
It is also possible to divide people into groups so that a system that is "unfair" under some criteria is "fair" under others. This is how we get political maps that some people consider fair by their measures while other people consider them highly unfair.
A system that is "fair" for a small group of people can be massively unfair when scaled up to a large group. Systems that were fair for a rural community are unfair when applied to the nation as a whole. Any attempt to rectify past unfairness will be unfair to others.
In short, no system will ever be "fair".
As we build AI systems and deploy them across more and more of our society, their unfairness and their errors become very important. Courts use such systems to help determine how to punish someone. Police use them to decide where to increase patrols. Hospitals are exploring systems to help judge who should receive more intensive treatment and who should be denied it. It is vital that we not make these systems into absolute judges. Mandatory sentences driven by these systems are massively unfair. We need humans making these decisions, not computer systems.
Every unfair situation offers opportunities. Neighborhoods that major restaurant chains avoid because of the residents' skin color can be opportunities to open a new restaurant. People denied housing or employment because of their past actions can be opportunities to pick up quality employees or to provide housing. In Dallas, a shopping center that was declared "dead" is showing new life because enough people with money live around it, even though they don't look like those in the rich part of town.
We can't make a totally fair system. But we can work to change the systems to be more fair. Both our businesses and communities will benefit from our efforts to make things more fair.