Can a machine solve a human?

Pavlo Kryvozub
MIT Tech and the City
2 min read · May 12, 2018

In the conclusion of Why Machine Ethics?, Allen, Wallach and Smit give a definitive answer to the question posed in the title: because we need to understand and reflect on the nature of humanity itself. How can we teach a machine to make ethical decisions if we do not know how to make them ourselves?

How can we quantify and code the mixture of the rational and the chaotic that forms us and the reality around us? Rational (utilitarian) decisions are inhuman; irrational ones are dangerous and unpredictable. In order to make a machine that thinks like a human, we not only need to understand ourselves, we need to understand the machine. In order to create a machine that behaves like a human, we need to become humans who behave more like machines. By programming the ethical values of humanity into machines, we might as well learn what humanity is. How can a machine solve the trolley problem if we cannot solve it ourselves? How can objectivity arise from subjectivity? How can order be born from chaos?

Fear of the future is worse than the future itself. Expectation of pain is worse than the experience of pain. The future approaches us no matter what. People are afraid of knowledge, as it brings uncertainty and the chaos of adaptation. The uneducated are afraid of the educated, who hold the power of knowledge. Our reality and our perception of reality are a matter of habits and customs. And, my god, it is so hard to break habits or look beyond your own field of view. Flat-earthers are a good example of the distance between knowledge and observable reality. People used to be afraid of cars, and now we live in the “nightmare” they were afraid of. Humanity is driven by the minority that pushes us forward; they define the future for everybody, whether we want change or not. A new order in which the machine drives and implements decisions does not distance us from decision-making, it simply automates it. We need not be afraid of the mistakes that are bound to happen as machines incorporate artificial moral agents. It is an evolutionary process, and the machine evolves at the same speed as humanity. Yes, machines can evolve faster than us, but we control their evolution.

To conclude, if a computer is a bicycle for the mind, as Steve Jobs put it, we need not be afraid of it as long as we can ride it where we need to go, and as long as its utility outweighs its maintenance. We must not be afraid of computers; we must be concerned about the people who control and guide them.
