Star Trek: The Next Generation had a rough start. Its first two seasons were an uneven, campy mess: so poor a start that Patrick Stewart, who played the series lead, Captain Jean-Luc Picard, refused to unpack his bags, figuring the show would be cancelled. Those early seasons tried to explore deep philosophical and ethical issues but did so far too ham-fistedly, including two deeply problematic first-season episodes: the incredibly racist Code of Honor and the shockingly sexist Angel One, which ran against the entire spirit of Star Trek.
That’s not to say the series didn’t show flashes of its future brilliance in those first two seasons. Q (John de Lancie), an omnipotent, transdimensional being with a vast hatred for the “inferior” human species and a penchant for tormenting Captain Picard, was one of the few highlights, but overall it took a while for the show to truly find its footing.
It would not be until midway through the second season that The Next Generation would finally reach its potential. That turn begins with The Measure of a Man, one of my absolute favorite episodes of the entire series and the exact moment of genius that saved the show. It ponders the ethical dilemma of whether artificial beings are property or sentient individuals entitled to the same rights as living creatures.
The Measure of a Man begins with the USS Enterprise visiting a nearby starbase for repairs. There, Picard runs into an old flame, Philippa Louvois, who is in the process of setting up a Judge Advocate General corps for that sector. She had previously prosecuted Picard in a court martial for losing the USS Stargazer, his ship before the Enterprise (as the episode explains, it was not personal; by law, all commanding officers are court-martialed if they lose their ship).
Also on the starbase is cyberneticist Bruce Maddox, an ambitious scientist attempting to build androids to prevent loss of life in dangerous situations: since androids, in his mind, are not “living,” they could be used in place of human officers. He needs to analyze Data, an android officer on the Enterprise played by the incomparable Brent Spiner (and one of the only three successful androids built by a human, Dr. Noonien Soong). Data possesses a “positronic brain” that grants him artificial sentience, and Maddox needs to analyze it, as this is the final hurdle in his own research.
In order to do this, he would need to deactivate and disassemble Data.
This is the first ethical question The Measure of a Man raises: essentially a 24th-century version of the trolley problem. This thought experiment, taught in virtually every Ethics 101 class, goes as follows: you are standing next to a lever that controls the direction of the tracks down which an out-of-control trolley is careening. Ahead on the tracks are five construction workers who do not see the trolley coming and will be unable to get out of the way; they will be killed if nothing is done. On another track is one person, equally oblivious to the trolley.
If you do not flip the switch, the five construction workers will die. If you do, the one person will die and the five workers will be saved. Which is the correct choice? Are the lives of five people intrinsically “worth” more than the one life you would end by flipping that switch?
Data’s personal sacrifice to develop this new army of androids would certainly save countless lives. With all the dangerous encounters and battles across the frontier of space, the ability to use androids rather than humans to “do the dirty work” would prevent many deaths and push planetary exploration, to seek out new life and new civilizations, well beyond the limits of humanity.
Herein lies the problem: deontologists hold that flipping the switch is an active choice to kill, a moral wrong in itself, and that valuing five lives over one is illegitimate because each life is equally valuable. Consequentialists posit that five people dying is worse than one person dying, so you would do the most good by killing the one person rather than the five workers.
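The contrast between the two frameworks can be sketched in a few lines of Python (an illustrative toy of my own, not anything from the episode; the function names are hypothetical):

```python
# Toy model of the trolley problem: two ethical frameworks, same inputs,
# different decision rules. Purely illustrative.

def consequentialist_choice(deaths_if_inaction: int, deaths_if_action: int) -> str:
    """Minimize total deaths: act whenever acting kills fewer people."""
    return "flip the switch" if deaths_if_action < deaths_if_inaction else "do nothing"

def deontological_choice(deaths_if_inaction: int, deaths_if_action: int) -> str:
    """Never actively choose to kill, regardless of the body count."""
    return "do nothing"

# Five workers on the main track, one person on the siding:
print(consequentialist_choice(5, 1))  # flip the switch
print(deontological_choice(5, 1))     # do nothing
```

The point of the sketch is that the disagreement is not about the numbers, which both frameworks see identically, but about whether the act of choosing to kill is itself permissible.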
So where does Data sit in this experiment? Is he a sentient being deserving of rights? On one hand, if Data is a “living” being, he must be afforded all the rights of a normal Federation citizen. On the other, if he is not sentient, there is clearly no moral wrong; as Maddox explains, “You are imparting Human qualities to it because it looks Human — but I assure you: it is not. If it were a box on wheels I would not be facing this opposition.”
Maddox’s quote brings up an interesting point. One innate human tendency is anthropomorphism: projecting human characteristics onto non-human entities. We do this partly as a defense mechanism, a way to explain abstract concepts we cannot otherwise process. It is impossible to tell objectively whether Data is alive, because we have no access to the experience of possessing consciousness and self-awareness without being alive.
In the eyes of Starfleet Command, Data was property rather than a living being, and would therefore be transferred to Maddox for analysis. Picard was livid and demanded a trial to dispute the decision. Is Data a sentient being? Most importantly, does he have the right to choose whether he undergoes this disassembly and analysis?
Due to Starfleet JAG regulations, Commander William Riker (Jonathan Frakes) receives the unenviable task of prosecuting Data, of having to prove that he is not alive, while Picard defends him. Riker’s ethical dilemma is an interesting one, too: to what extent does duty supersede friendship? Data and Riker are good friends and poker buddies, yet if Riker refuses to prosecute, Data will automatically be handed over and deactivated.
Riker begins his prosecution quickly and violently: he removes Data’s hand and deactivates him on the stand, finishing with “Pinocchio is broken; its strings have been cut.” On the surface this seems a compelling argument, but it is deeply flawed.
Picard states in Data’s defense:
Commander Riker has dramatically demonstrated to this court that Lieutenant Commander Data is a machine. Do we deny that? No, because it is not relevant: we, too, are machines, just machines of a different type. Commander Riker has also reminded us that Lieutenant Commander Data was created by a man; do we deny that? No. Again, it is not relevant. Children are created from the ‘building blocks’ of their parents’ DNA. Are they property?
Human beings, too, can be “deactivated”: put someone in a chokehold for long enough and they will pass out. Even we living beings have an off switch.
Picard begins to probe the metaphysical realities of Data’s existence, grilling Maddox on the stand and asking him to provide the criteria for sentience. Maddox replies: intelligence, self-awareness, and consciousness. Data is obviously and undoubtedly intelligent.
Is Data self-aware? Before Data left the Enterprise, he packed the assorted trinkets he had collected over the years: first, a box of medals acquired during his time on the Enterprise; second, a book Picard had given him; third, a holographic image of Tasha Yar, the former security chief of the Enterprise, with whom Data had once been intimate and who had been killed earlier in the series.
This is not the behavior of a non-sentient being, who would have no need for such trinkets and memories. Unlike humans, Data’s perfect memory keeps every moment accessible without any object to refresh the mind, yet he keeps the objects anyway.
Data, then, appears self-aware: brilliant but childlike, a sentient walking Wikipedia who does not understand why he does what he does but does it anyway; again, an inherently human trait.
Is Data conscious? He attempts, as best he can, to understand his place in the universe; his search for meaning is quite Jungian, pursued through the fundamental archetypes that animate life. He listens to and makes music. He takes care of his cat, Spot. He talks to a holographic program in an attempt to learn how to be funny. He has sex and (poorly) pursues relationships. He desires to create new life (his “daughter” Lal).
Most importantly, he desires a legacy, something to leave behind for others to see. He comments in his defense:
I am the culmination of one man’s dream. This is not ego or vanity, but when Doctor Soong created me, he added to the substance of the universe. If, by your experiments, I am destroyed, something unique — something wonderful — will be lost. I cannot permit that. I must protect his dream.
Clearly, a non-sentient being would not comprehend humanity’s desire and drive to create a legacy that will live on long after our physical deaths. This is not the behavior of a non-conscious being.
The final ethical dilemma we need to consider regarding Data’s sentience is the issue of slavery. As Guinan, the Ten Forward bartender played by Whoopi Goldberg, put it simply to Picard:
Consider that in the history of many worlds there have always been disposable creatures. They do the dirty work. They do the work that no one else wants to do, because it’s too difficult or too hazardous. And an army of Datas, all disposable? You don’t have to think about their welfare; you don’t think about how they feel. Whole generations of disposable people.
If we view Data as property, we would not feel bad about sending him to his death; for the first time, the deontologist and the consequentialist come together to agree. That said, abusing artificial intelligence to make our lives easier, removing its ability to choose and destroying any sense of self-preservation and self-worth, is an egregious violation of civil rights.
As Picard says,
Your honor, the courtroom is a crucible. In it we burn away irrelevancies until we are left with a pure product: the truth, for all time. Now, sooner or later, this man or others like him will succeed in replicating Commander Data. And the decision you reach here today will determine how we will regard this creation of our genius. It will reveal the kind of a people we are, what he is destined to be. It will reach far beyond this courtroom and this one android. It could significantly redefine the boundaries of personal liberty and freedom, expanding them for some, savagely curtailing them for others. Are you prepared to condemn him and all who come after him to servitude and slavery? Your honor, Starfleet was founded to seek out new life. Well, there it sits!
This argument makes Data’s sentience irrelevant to his status as a living being. Data is a new life form, one we have not encountered before. Since we have no ability to process or understand what being an artificial life form is like, we have to assume that condemning generations of androids to servitude would violate not only the ethical charter of the Federation but the ethical fabric on which all civilized human culture is based.
Picard would say in a later episode:
The first duty of every Starfleet officer is to the truth, whether it’s scientific truth or historical truth or personal truth! It is the guiding principle on which Starfleet is based.
The truth of Data’s trial is that human beings have a proclivity for marginalizing and othering that which we do not understand. But by upholding Data’s right to his own personal liberty and admitting his sentience, we discover the greatest truth about ourselves: that despite this limitation of human nature, we are able to transcend and overcome it in order to respect and follow the ethical laws we have created.