Relating to Artificial Persons in Everyday Life

How does our treatment of artificial persons shape our character, relationships, and human understanding?

by J. Nathan Matias, Lydia Manikonda, Scott Hale, and Kenneth Arnold

This post is the third in a series of short introductions to artificial intelligence designed for group discussion in non-technical Christian settings. To follow the series, sign up for our email list, hosted by the Oxford Pastorate.

Whoever is righteous has regard for the life of his beast, but the mercy of the wicked is cruel. Proverbs 12:10

When Proverbs defines righteousness to include the way we treat animals, it seems to support a virtue-based view of our relationships with other beings. In this view, grounded in Aristotle and Aquinas, our moral lives are defined by character and virtue as much as actions or their consequences. When we act cruelly toward an animal, we should worry about the effect on our own character, not just the harm we cause. As artificial persons become part of everyday life, Christians will need to think about the effects of our relationships with those persons.

hitchBOT hitchhiked across Canada and Europe before being destroyed by vandals in the US in 2015.

Parents who use Apple’s Siri or who have installed Amazon’s Alexa are already struggling with what to teach their children about relating to intelligent personal assistants (IPAs). These systems typically keep a microphone running, listening for a person to call their name and make a request. The IPA then draws on its organization’s resources to try to answer that request. Responding with a human-like voice, IPAs make phone calls, schedule meetings, answer questions, and order products. Since some IPAs are confused by polite language, children can learn to speak impolitely to them. The parents of those children have discovered that, whether or not an IPA is conscious or can feel slighted, the way we treat the beings in our lives can shape our own character in other situations.

The way we treat the beings in our lives can shape our own character in other situations

IPAs offer powerful, meaningful opportunities to broaden access to complex technologies. Humans have an incredible ability to interact with other persons at levels of emotional flexibility and intellectual complexity that many of us would struggle to achieve with symbols or abstract concepts. It’s no surprise that God has chosen to relate to us as a person, since relationships are among the most nuanced, widespread interactions we share. Yet when we relate to artificial persons, we should ask who is on the other end: today’s IPAs tend to be run by corporations that stage-manage them to reduce labor costs and influence our decisions.

Artificial persons have been a human obsession for millennia, and many people have hoped that we might be able to treat these persons with greater moral license than humans. In the 1930s, the Westinghouse Corporation created “Rastus Robot,” a black-skinned artificial person who they implied might someday carry out the work of former human slaves. IPAs have also long been the subject of fantasies about power and sex; the male-dominated computing industry continues to create predominantly female artificial assistants. Yet despite fears that computer systems cause humans to turn away from human contact, the available evidence suggests that use of social technologies also deepens our relationships with other people.

  • How might relationships with IPAs deepen our worship and our human understanding?
  • How should Christians think about the morality of the ways we treat artificial persons?
  • Can Christianity offer ways to think about creating beings in our own image that include relationships other than servitude?

References

Darling, K., Nandy, P., & Breazeal, C. (2015, August). Empathic concern and the effect of stories in human-robot interaction. In Robot and Human Interactive Communication (RO-MAN), 2015 24th IEEE International Symposium on (pp. 770–775). IEEE. Retrieved from http://ieeexplore.ieee.org/document/7333675/

Heffernan, T. (2017, May 13). Cyborg Futures: Born in Fiction. Retrieved July 11, 2017, from https://socialrobotfutures.com/2017/05/13/cyborg-futures-workshop-summary/

Hursthouse, R., & Pettigrove, G. (2016). Virtue Ethics. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Winter 2016). Metaphysics Research Lab, Stanford University. Retrieved from https://plato.stanford.edu/archives/win2016/entries/ethics-virtue/

Myers, K., Berry, P., Blythe, J., Conley, K., Gervasio, M., McGuinness, D. L., … Tambe, M. (2007). An intelligent personal assistant for task and time management. AI Magazine, 28(2), 47. Retrieved from https://www.aaai.org/ojs/index.php/aimagazine/article/view/2039

Oppenheimer, M. (2014, January 17). Technology Is Not Driving Us Apart After All. The New York Times. Retrieved from https://www.nytimes.com/2014/01/19/magazine/technology-is-not-driving-us-apart-after-all.html

Truong, A. (2016, June 9). Parents are worried the Amazon Echo is conditioning their kids to be rude. Quartz. Retrieved from https://qz.com/701521/parents-are-worried-the-amazon-echo-is-conditioning-their-kids-to-be-rude/

Turkle, S. (2005). The Second Self: Computers and the Human Spirit. MIT Press.