If we feed our history to computers we are condemned to repeat it
AI bias is an issue I only heard about a couple of days ago. There's a real risk that thinking machines devised by humans will inherit our prejudices as well.
Some people are already quiet victims of such algorithms (e.g. credit checks). So before we worry about the looming existential threat that machines smarter than us all could inflict on the human race, and before asking computers to solve the trolley problem, we should make sure the data used to teach the machines is the right data. It is not an easy problem: Google's photo recognition software classified Black people as gorillas, and Nikon cameras "thought" Asian people were blinking.
AI is already used to assess the risk of recidivism, in medical software, or worse, in military robots. If we fed a machine historical data on what makes a great president for a country, it would surely exclude women, young people, immigrants, or anyone who is not white and male.
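That last example can be made concrete with a toy model. The data and the "learning" rule below are entirely hypothetical, a deliberately naive sketch rather than any real system, but they show the mechanism: a model that generalizes from a uniformly biased history will reject anything that differs from it.

```python
# Hypothetical training data: every historical example shares
# the same demographics, so the "pattern" the model learns is
# pure demographic bias.
historical_presidents = [
    {"gender": "male", "ethnicity": "white"},
    {"gender": "male", "ethnicity": "white"},
    {"gender": "male", "ethnicity": "white"},
]

def learn_pattern(examples):
    """Keep only the attribute values shared by every training example."""
    pattern = {}
    for key in examples[0]:
        values = {example[key] for example in examples}
        if len(values) == 1:
            pattern[key] = values.pop()
    return pattern

def looks_presidential(pattern, candidate):
    """'Presidential' here just means: matches the historical pattern."""
    return all(candidate.get(k) == v for k, v in pattern.items())

pattern = learn_pattern(historical_presidents)
print(looks_presidential(pattern, {"gender": "male", "ethnicity": "white"}))
# True: fits the history
print(looks_presidential(pattern, {"gender": "female", "ethnicity": "white"}))
# False: rejected for differing from history, not for lack of merit
```

Real machine-learning systems are statistical rather than this crude, but the failure mode is the same: if the history is one-sided, the model has no way to tell merit apart from demographics.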