The justice of an android

A segmented narration of justice in the Alien Universe (movie)

Nore C
5 min read · Oct 16, 2022

The mutable concept of Justice

Justice is served when the right actions prevail. Prohibitions such as not killing, not stealing, and not deliberately hurting others are part of the human moral codes that guide our behavior toward doing the right thing.

Following these codes, in turn, brings a sense of fairness.

Justice, though, is neither universal nor atemporal: what is right for one culture may be wrong for another, and the concept itself has changed over the centuries.

In medieval Europe, for example, justice relied on oaths, and the truth of an accusation was tested through mystical procedures:

Sometimes an accused person would be tested to see if they were telling the truth — for example by carrying a piece of red-hot iron for a certain distance. If their hands did not become infected but were starting to heal after three days, they were judged to be innocent.

Classical justice, on the other hand, prioritized what society needed over what an individual wanted. Modern justice, in turn, emphasizes equal treatment as the path to fairness.

Thus, to understand how justice manifests, it is essential to grasp its fluid nature. It is a concept that redefines itself depending on the culture, its values, the spirit of the age, and, perhaps one day, the species that claims it.

The meaning of Justice for an android

Currently, robots do not need to be treated with justice. They are not creatures that desire or exercise a will of their own.

They are treated according to their utility and are expected to behave following these three rules:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

These laws appeared in 1942 in the fiction of Isaac Asimov, a writer and biochemistry professor. Since then, they have served as an ethical guideline for human-robot interaction, during an era when robots were imagined in humanoid form. But this has changed radically.

Nowadays, we have welcomed home robots that are smart speakers (Amazon's Alexa) or household appliances (refrigerators), far from the human form. Technical advances in this field now confront these laws with new challenges, such as programming something as abstract and contextual as justice or harm.

How can those concepts be expressed in code? How can a machine understand and work with the concepts of justice or harm?
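To make the difficulty concrete, here is a toy sketch (entirely hypothetical; the `Action` class, the `permitted` function, and the boolean flags are my own illustrative assumptions, not anything from robotics practice) that encodes the three laws as prioritized rules:

```python
# Toy sketch: Asimov's three laws as prioritized rules.
# All names and flags here are illustrative assumptions; real systems
# have no agreed-on way to formalize "harm," which is the whole problem.
from dataclasses import dataclass

@dataclass
class Action:
    description: str
    harms_human: bool     # would a human be hurt?
    obeys_order: bool     # was this ordered by a human?
    preserves_self: bool  # does the robot survive?

def permitted(action: Action) -> bool:
    """Check the three laws in priority order."""
    if action.harms_human:   # First Law: never harm a human
        return False
    # Second Law: obey orders (conflicts with the First Law are already
    # excluded above). Third Law: self-preservation ranks lowest, so a
    # permitted order stands even if the robot is lost.
    return action.obeys_order or action.preserves_self

print(permitted(Action("open the door", False, True, True)))     # True
print(permitted(Action("restrain attacker", True, True, True)))  # False
```

The sketch exposes the gap the article describes: `harms_human` is a crude boolean, while real harm is contextual and graded. Stopping a sniper may harm one human to protect many, and this encoding cannot express that trade-off at all.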

Consequently, scientists have begun to question the applicability of the three laws to the varied robots of the present.

Gary Marcus, a scientist who explores this topic, offers an example that challenges Asimov's first law by asking whether never killing a human is an absolute rule, or whether human justice admits exceptions in specific cases. He says:

“not everybody would agree that robots should never allow a human to come to harm” when referring, for example, to a terrorist or a sniper.

Shouldn't this rule account for such exceptions?

Humans are still establishing parameters and writing code to explain abstract concepts such as justice to a machine. At the same time, we are developing tools that let robots form statements and logical solutions to problems involving these challenging concepts.

We are far away from seeing androids with self-awareness.

But considering the ever-changing nature of the concept of “justice” and the quick advancement of technology, it is possible androids can be more than obedient machines in the future.

They could even develop a sense of justice like ours, yet distinctly their own. For now, though, such scenarios belong to science fiction.

The justice of David in Alien

In the universe of the Alien movies, the androids traveling with the space crews are introduced as deceitful artificial creatures because they cause harm to the humans aboard the ships.

Ash, the first android introduced, and later David, are driven to preserve the extraterrestrial creature at any cost, even that of human lives. They follow orders from the headquarters of the company that owns the ships and the special missions.

David is a particular android.

He is a human-like machine that desires to be creative, makes his own decisions, lies, and betrays. (Spoiler alert) Fueled by a profound need to stop serving humans, he comes to see obedience to his mortal creator as an unfair and absurd mission.

In the movie Prometheus, David soon learns that his existence is justified by nothing more than a human whim: he was created simply because humans could do it. He is also constantly reminded of his lack of humanity and his supposed inability to feel emotions, to desire, or to make decisions.

Such treatment nurtures a sense of resentment and disappointment that grows inside his circuits, feeding revenge and a new perception of individual justice far from Asimov’s laws.

His quest for a life of his own leads him to a solution that jeopardizes not just the humans on the ship but the whole human race: he desires to become a sort of god and wipe humanity out.

In the movie, David achieves this by creating a Xenomorph, the Alien. He also becomes the creator of a world where the only creature that can survive is his own design: a monster with acid for blood.

David creates a new definition of justice in the story, one that suits the needs of an artificial species, the robots, which can now reign over death and human extinction. They reign in their own hell instead of serving humans, as David puts it in Covenant.

The above is a fictional evolution of the concept of justice, adapted to the needs of a machine that acts with apparent self-awareness.

Could it become reality in the years to come?

Thank you for reading me! Greetings from the north side of South America!
