Should Sex Robots Be A Thing?
I’m sure this is something you have thought about. When you were in the shower. Or when you’re lying in bed, in the dark, by yourself, or not by yourself. Or when you’re hearing about some new robotic product that has so-and-so features. Or when, you know. But yes, yes, you have thought about it.
If you ever asked me what I think of sex robots, I’d probably say “GOOD QUESTION…” and then send you the article below. It’s the final research paper I wrote for a seminar on morality in cognitive science last year. It ended up touching on many aspects of humanity that I care about personally, and I made my best effort to cover as many of them as I could.
This paper is far from perfect (e.g., I only came across the Campaign Against Sex Robots after I submitted it). Nonetheless, I think it covers many different perspectives, and it’s a little sad that the only other person who’d read it was my professor (the dear, dear, widely-and-deeply-loved Gwen Broude, one of my most cherished idols). So here it is, on the internet.
The Morality of Sex Robots: An Experience-Driven, Human-Centered Approach
Introduction
While to many of us sex robots still feel like something that exists only in science fiction, they may soon become a reality. In fact, some companies have already been selling functioning sex robots (such as http://www.truecompanion.com/), while many others are in the process of creating them (Cheok, Levy, Karunanayaka & Morisawa, 2015). Sexuality, personal robots, and intimate relationships are all morally significant subjects for human beings; since sex robots are relevant to all of those and beyond, it is not surprising that their creation and use raise many ethical concerns (Mackenzie, 2014). These ethical issues urgently need to be addressed, both because they concern the physical and emotional well-being of users and carry social implications, and because otherwise the motivations driving the design and use of sex robots will soon be dominated by market demands, which can easily drown out calls for ethical reflection (Whitby, 2008; Scheutz & Arnold, 2016).
The current state of discourse on these ethical concerns, however, is heterogeneous in its approach and focus. While all the articles I have come across in my research cover some important ethical concerns, few of them draw any definitive conclusions, and most admit that the conversation has only just begun. The current article has two goals. First, I want to discuss what approach we should take when we address moral problems, particularly those about personal or social robots. I argue that an embodied and situated perspective is the right one to take; such an approach holds that moral questions are best addressed through imaginative problem-solving, toward a goal of human good that emerges from experience rather than being preordained by static principles (Johnson, 2015). This approach thus requires a focus on the human experience of robots, not on robots themselves in isolation (Coeckelbergh, 2009). Then, following this embodied, experience-centered approach, I review some of the ethical concerns about sex robots that other researchers have raised, in light of what they say about how we view sexuality and social relationships, as individuals and as a society. Ethical concerns that are not specific to sex robots but universal to novel technologies are also discussed.
Throughout this article, I will try to ground my discussion in what we know about actual human experience, not merely in philosophical postulation about it. My options are limited, as we have little data on human experience with sex robots, but I will try to at least draw analogies from related experiences. Part of the goal is to consider as many factors as possible about the current state of our society and the problems it faces regarding sex robots; the other part is to imagine a future of human good based on these factors (Johnson, 2015). Therefore, I will discuss possible research that may give more insight into our current situation, as well as possible paths from our current situation to a desirable future that avoid undesirable ones.
The Call for an Embodied Approach to Robot Ethics
When people discuss robot ethics, especially when it comes to social robots, the typical lines of argument start with the debate over whether the robots in question are sentient beings, or whether robots can ever be truly sentient (Coeckelbergh, 2009). This in turn affects our decisions about whether to grant them rights, legal status, protections and/or responsibilities, and if so, what kinds to grant (Mackenzie, 2014; Di Nucci, 2016). Some people talk about robots explicitly assuming that they have no agency of their own (which is often true of current technology) and should be considered objects or machines, albeit of a special kind (Di Nucci, 2016; Whitby, 2008; Scheutz, 2012). Others talk about them assuming they can, depending on their particular designs, have agency of their own and therefore deserve rights and legal protection — if not equal to those for humans, then comparable to those for animals (Mackenzie, 2014; Levy, 2012). Consequently, there is much research interest in methods that assess intelligence, consciousness or agency in robots, such as the Turing test or the Chinese room argument (Coeckelbergh, 2009).
However, there are problems with this approach to robot ethics. Research and anecdotal evidence have shown that even when a robot is designed to have no agency of its own and does not even look human, it is still easy for us to bond with it socially and to ascribe to it sentient features such as the ability to suffer, to feel, or to judge. For example, when a bomb-defusing robot (which is not designed to act socially) joins a team of soldiers, the soldiers often quickly bond with it, give it a name, talk to it, and even introduce it to family and friends (Scheutz, 2012). When it gets destroyed in a mission, some soldiers report feeling extremely upset, as if they had lost a human teammate. Clearly, these soldiers are not viewing the robot as a mere object, even though it fails any test of having a mind; they perceive it, perhaps, as someone who deserves ethical treatment. We see the same pattern of social bonding among people with robotic pets, and even with the Roomba vacuum cleaner (Scheutz, 2012). What this pattern shows is that how robots affect us, and how we think we should treat them, may depend not on whether they are sentient but on whether we perceive them as sentient; it is not so much a matter of their internal design as of their appearance to us, that is, our experience of them (Coeckelbergh, 2009). Because social robots are typically human-like in function, behavior, and appearance, this will perhaps be even more of an issue (Coeckelbergh, 2009; Whitby, 2008).
There are good reasons to advocate an approach to robot ethics (and perhaps all ethics) that emphasizes embodied human experience and views human good, and any principles about it, as emerging from that experience (Coeckelbergh, 2009; Johnson, 2015). First, it does not stand to reason that there exists a generalized rule about what is morally right that did not arise from some experience and is somehow transcendent and foundational; Johnson (2015) provides in his book an excellent account of this argument, which I will not repeat. If all moral principles have emerged from experience, then these principles may change and new principles may emerge, because as our world changes dynamically, so do our experiences of what feels good and harmonious versus what feels bad. Morality, from this perspective, is the application of our general problem-solving process, which involves imaginative and predictive mental simulation, to find the option, among all the alternatives, that leads to the most desirable and harmonious result, which is what defines human good (Johnson, 2015). In the case of sex robots, or robots in general, the focal point should be the human experience of robots — how we feel about them, think about them, and what we do with them — before we worry about whether they are “truly” sentient or deserve rights (Coeckelbergh, 2009).
There are further arguments that undermine the standard approach. It is hard, if not impossible, to assess whether an agent truly has a mind or consciousness, or how comparable it is to that of humans (Coeckelbergh, 2009). People’s attitudes and behaviors towards robots, however, can be operationalized and studied with relative ease. I would also argue that a robot’s internal design is only one of many factors in how it appears to us; essentially, it is the whole appearance and our experience of it that matter, not just the internal design. If you and I both perceive a robot as human-like and thus deserving of ethical treatment, even though its design does not include the ability to suffer, then by hitting it I send the message that I think it is okay to hit people. Alternatively, if you and I both perceive it as non-human, my behavior would not send the same message.
I hope these arguments for an experience-driven, human-centered approach to robot ethics help frame my following discussion of the current moral problems we face regarding sex robots. While it would be ideal to consider as many factors as possible, the existing literature on this topic, as well as my own opinions on the matter, still suffers from limited evidence and perspectives, and this moral problem-solving process is still in its early stages. Additionally, most of the discourse has addressed only a western, industrialized, and heteronormative kind of society.
A “Good” Future With Sex Robots
While there are all kinds of ethical concerns about sex robots, two major benefits have convinced many scholars that there is some possible future with sex robots that is also good for humanity (Levy, 2012).
The first is that sex robots can be a solution for fulfilling the positive sexual rights of people who cannot masturbate and do not have access to consensual, mediated sex (Di Nucci, 2016). While historically taboo, sexuality is an important aspect of the human experience, and there is much support for including the positive sexual right — the right to enjoy pleasure from sexual behavior — among basic human rights. However, a dilemma arises when we also want to grant everyone negative sexual rights, the right not to have sex with someone. If there are people so severely mentally or physically disabled that masturbation is not an option, and (hypothetically, but plausibly) no one is willing to have sex with them, we cannot fulfill their positive sexual rights without violating someone’s negative sexual rights. Di Nucci (2016) proposes that sex robots can resolve this paradox, if they can be made sophisticated enough to provide an experience of mediated sex that adequately fulfills these individuals’ positive sexual rights, but not so sophisticated that anyone would believe they are appropriate bearers of negative sexual rights. He argues that if this fine point of sophistication is difficult to achieve, it is not a normative or moral issue but a technical one, and it will resolve itself when our technology gets there. Additionally, he points out that since these people are often emotionally and physically vulnerable, sex robots are better than willing humans because they are easier to monitor and control. Beyond this, sex robots may also bring new options for sex education and therapy, which are likewise needs from a medical perspective (Levy, 2012). To confirm these benefits, nonetheless, much research is needed on the positive sexual needs of the actual individuals who might be eligible for these kinds of services, and on their attitudes towards robots or mediated sex, instead of only theorizing about them.
The second major benefit is that robot prostitutes may be a cleaner, safer substitute for human sex workers (Levy, 2012; Mackenzie, 2014). The sex work industry as it stands involves much criminality, including human trafficking, violence, rape, and other forms of abuse, as well as a high risk of sexually transmitted infections (STIs) and unwanted pregnancies. While many sex workers entered the industry voluntarily, many of them face similar risks (Mackenzie, 2014). If sex robots can be designed to provide services of comparable quality, clients may consider them a superior option over human prostitutes: as long as the robots are washed properly after use, they pose no risk of STIs or pregnancy, and they may reasonably be perceived as more trustworthy in keeping the service confidential (Scheutz, 2012). Indeed, in an online survey conducted in the United States, both male and female participants showed significantly positive attitudes towards the idea of using sex robots instead of human prostitutes (Scheutz & Arnold, 2016). Another goal is to reduce the criminality of the sex work industry; the hope is that the robots will be competitive enough to drive illegal prostitution out of business. There remains some debate on how this will work out. On the one hand, the cost of each sex robot is much higher than that of obtaining an individual human sex worker through human trafficking, so arguably there is not enough incentive for human traffickers and abusers to legitimize their business (Mackenzie, 2014). On the other hand, some use this point to argue for a future of robot prostitutes that people pay to spend time with, rather than personal sex robots that people buy and own; because there is little additional cost besides maintenance, it is unlikely that those who own such robots will make less money than they would through human trafficking (Levy, 2012). However, there is little evidence for either side, and future research from business, economic, and legal perspectives might be of use (Mackenzie, 2014).
Ethical Concerns About Sex Robots
While these two major benefits of sex robots may seem appealing, sex robots, because of how they relate to some sensitive aspects of human nature — sexuality and social relationships — raise many more concerns than solutions (Levy, 2012; Scheutz, 2012; Whitby, 2012). Here, I want to discuss these concerns in light of our perception of sex robots as being sexual and social, focusing on their potential impact on us through these two different aspects of our experiences with them. This discussion rests on the assumption that, because sex robots have human-like appearance and behaviors, individuals and society will perceive them as functionally human-like and interact with them much as we would with other human beings; thus, how we treat them will inevitably reflect what we believe is appropriate for humans (Coeckelbergh, 2009). After this, I also discuss some additional concerns that have more to do with their nature as a novel, digital and physical technology.
1. Sex Robots as Sexual Agents.
Are we ready for sex robots? Some Islamic scholars, for example, have argued at length from the perspective of Sharia law that engaging in sexual behavior with robots is not only disrespectful to the institution of marriage and to human dignity, unethical, and immoral, but also punishable (Amuda & Tijani, 2012). A conservative Islamic society, then, is perhaps not ready for the propagation of this technology. Meanwhile, people in the United States seem more supportive of the general concept of sex robots and consider their use appropriate in most cases, as shown in an online survey (Scheutz & Arnold, 2016). Given that the use of objects in sexual activities has historically been normal in the West, and that many participants reported viewing sex robots more as objects than as sentient beings, this result is not surprising (Lunceford, 2013). Among these uses, “replacement of human prostitutes” and “use for disabled people” received the highest appropriateness ratings. This is especially true of male Americans; on a 7-point scale with 7 being “completely appropriate,” they gave every use of sex robots an average rating of at least 4.6. Female Americans, however, gave lower appropriateness ratings for the uses of sex robots overall. This discrepancy reflects gender differences in attitudes towards not only sex robots but sexuality in general in American society, which relate to differential social expectations for gender roles as well as the sexism and misogyny that underlie them (Scheutz & Arnold, 2016). This has ethical implications not only for the design and advertising of sex robots aimed at different sexes, but also for other sexuality-related subjects. As some scholars have pointed out, our attitudes towards a technology reflect where we currently stand as a society (Whitby, 2008; Scheutz & Arnold, 2016; Levy, 2012; Mackenzie, 2014). Learning this is of utmost importance under our embodied approach to robot ethics, because our current situation is the moral problem we aim to solve (Johnson, 2015; Coeckelbergh, 2009).
Even though participants reported viewing sex robots as objects, they gave lower appropriateness ratings to uses of robots that are normally considered inappropriate in human sex, such as use by sex offenders, or the use of robots that look like children or family members (Scheutz & Arnold, 2016). This might be because how the robots are treated is nonetheless perceived as reflective of how humans should be treated, which supports the basis of our approach. If a society permits the practice of normally unacceptable sexual behaviors with sex robots, it will inevitably challenge that society’s norms of what is and is not considered acceptable and ethical (Mackenzie, 2014; Levy, 2012). Unfortunately, there will be people who want to engage in unacceptable sexual behaviors such as pedophilia, zoophilia, rape or violence, and they already do, despite how immoral, illegal, dangerous or exploitative these behaviors are (Mackenzie, 2014). If we do not legally restrict these kinds of behaviors with robots, people will inevitably design sex robots that fulfill the desire for these normally unacceptable sexual behaviors, because acting them out with a robot is less risky than breaking the law. This is the main concern about sex robots, especially in the context of robot prostitution (Mackenzie, 2014; Levy, 2012).
This ethical problem is threefold (Whitby, 2008). First, how do we decide what we should consider acceptable in the case of sex robots? How different would the standards be from those for humans, and should they be very different? Following our experience-driven approach, there is no way to answer these questions without further research on the feelings, thoughts, and behaviors of actual humans living in a society, but at least we know that is what we should do. It is important to note that social norms of acceptable sexual behavior do change, and the related legal restrictions change accordingly (Mackenzie, 2014). Homosexuality was considered immoral, and in some places illegal, in the West only a few decades ago, but today it is widely accepted as normal and gay marriage is legal in many countries. Second, once we have a better idea of what we deem acceptable and unacceptable, to what degree can we ethically and justly put limits on those behaviors, whether in the design of the robot or by legal means? For example, if we decide that it is not okay to have sex with a child-like robot, and then someone does exactly that, do we roll our eyes, throw this person in jail, or punish them with a fine? Alternatively, do we ban the production of child-like robots altogether? These questions seem harder to answer, although some scholars have proposed that, depending on the degree to which a robot has (or can be perceived as having) agency and thus should reasonably be considered a moral patient, different legal statuses can be granted to protect its rights (Mackenzie, 2014). Third, if we decide to prevent these problems during the engineering stage, what new ethical consequences would arise? For example, if we design robots to record and report any unacceptable sexual behavior they detect, the user’s confidentiality is at stake. If we design features that are legally relevant in the human case, such as consent (Mackenzie, 2014), we might create new problems by endowing robots with human traits and pushing for them to be perceived as more human than their programs are, because that would deceive the users (Scheutz, 2012). These engineering choices involve many more options as well as much more uncertainty, but taking the time to research and experiment before release may effectively control the social impact of sex robots, and perhaps should even be required (Di Nucci, 2016). Currently, the field of human-robot interaction (HRI) has not found effective ways to study human interactions with sex robots in lab settings, which puts real constraints on what we can learn about how these interactions work (Scheutz & Arnold, 2016). We need more conversations on how to proceed, especially about the appropriate professional and legal regulations for the market that distributes sex robots, the industry that provides their services, and the tech industry that designs and produces them (Mackenzie, 2014; Di Nucci, 2016).
2. Sex Robots as Social Agents.
A sexual relationship is one kind of social relationship, and a very intimate one at that. Sex robots belong to the larger category of social robots, and many concerns about social robots in general also apply to sex robots; here I discuss the asymmetrical nature of the human-robot relationship, robot abuse, the effects on human-human relationships, and cheating.
Most ethical concerns about social robots regard the potential harm of humans’ unidirectional emotional attachment to robots. At least given current technology, robots are not capable of forming emotional attachments in a human way, and therefore any impression that they could would be a mere illusion, if not intentional deception (Scheutz, 2012; Whitby, 2012; Levy, 2012; Cheok et al., 2015). While there is interest in algorithms that simulate human emotions, many researchers have flagged this as potentially unethical (Levy, 2012). Some scholars argue for ethical limits on manipulating human psychology in robot design, including not fooling people into ascribing more feelings to the machines than they should, never lying to users intentionally to manipulate their behavior, and always being explicit about the principles and limits behind a robot’s algorithms (Cheok et al., 2015). Given the evidence we discussed earlier in this article and the intimate nature of sexual relationships, it seems reasonable to assume that some people will form a unidirectional attachment to robots. The ethical consequences of this situation are unclear and need more research, but they may include taking away from the attached user’s other social relationships with humans (Whitby, 2008; Levy, 2012).
On the other side of emotional attachment are aggression and physical violence towards robots. This involves the same threefold questions we went through when discussing what sexual behaviors should be considered acceptable with sex robots, so I will not repeat them. There has been some research on violence directed towards robots; people actually become more abusive when the robot is more human-like, which raises concerns about what message this sends (Whitby, 2008). While there are many similarities between violence in computer gaming and violence towards robots, in the computer gaming case society has largely evolved to tolerate such violence, pointing us to a future where society is unlikely to intervene when people mistreat robots. Whether that is just will, again, depend on whether people eventually perceive sex robots as completely non-human objects, or as having a rights-deserving status (Mackenzie, 2014). One way to avoid this concern is to design robots that always interact in such a way that humans cannot get angry at them (Whitby, 2008). However, we have already discussed the ethical issues with this approach. This is quite a dilemma.
In some ways, a robot that is too lovable might be more problematic than a robot that people feel they can abuse. If users like a sex robot and feel attached to it, they may want to spend more time with it, taking away from their relationships with other people or from their hobbies (Whitby, 2008; Lunceford, 2013). An analogy is how cellphones, while radically redefining the way we communicate with each other, can also make us stare at our screens when we should be spending quality time with our friends. There is an additional concern that, if users who are socially awkward find robot friends or robot lovers adequate sources of social interaction, they might lose the incentive to go out and meet “real” people (Whitby, 2008). Although I think it is a valid hypothesis, this concern assumes a normative standard that “it is better to interact with real people than with robots,” and research is needed on whether our society actually holds this standard, and whether it believes people are morally obliged to follow such norms.
Things become more complicated when a user of a sex robot is also in a committed relationship with a human. Is it cheating if the user has sex with a sex robot? While subjects in the online survey reported viewing “using sex robots instead of cheating on a partner” as relatively appropriate (with average ratings of 5.42 for men and 4.37 for women), the subjectivity of what counts as sexual makes cheating hard to define (Lunceford, 2013). The status of robots as objects does not imply that humans are incapable of feeling intimate with them, and in feeling and being intimate with the robot, emotionally and/or physically, a user is being less intimate with their committed partner; it is possible for the resulting impact on the relationship to be the same as if the user were cheating with a human (Lunceford, 2013). If social norms do not take this kind of “cheating” seriously, it might actually be easier for the user to engage in this transgression and justify it, as the risk of social sanction is not as great. However, it is also possible that the result is different, because ultimately it all depends on how the user and the partner view sex robots. The partner may perceive sex robots as mere objects and not consider it cheating. The partner may instead decide that the user’s emotional attachment to the robot makes it cheating, and that is also valid. The key to this ethical question, then, is proper communication between the user and the partner to make sure everyone is on the same page. Just as in other kinds of polyamorous or open relationships, explicit consent should be required from all involved parties, not just those engaging in the sexual behaviors but also their partners (Lunceford, 2013; Levy, 2012).
These concerns raise another question: who should have access to sex robots? If we can reasonably expect that these risks are hard to avoid completely through design, it might be right to limit access to this technology to those who really need it, or to place frequency limits on users’ access to the robots (Di Nucci, 2016).
3. Sex Robots as a Technology.
Sex robots are, after all, an advanced technology. Technologies are not inherently right or wrong, good or bad, but they can be used in right or wrong ways that lead to good or bad human consequences (Di Nucci, 2016; Whitby, 2008). I discuss three concerns here.
The first concern is known as “the responsibility gap.” This issue applies to most autonomous systems, and scholars have discussed it comprehensively in the context of military robots (Di Nucci, 2016). The concern is that because different stages of the decision-making process (including designing the robot, ordering its use, and using it) involve different people, when the robot malfunctions and causes undesired consequences, it can be ambiguous who is responsible. During sexual behaviors, the user is usually in a state of physical and emotional vulnerability; considering that many of the potential users are people who are already disabled, there are good reasons to make sure that things do not go wrong, and to know whom to hold responsible if they do. There is some agreement that the machine itself cannot be held responsible, at least given the state of the art of the technology today, but whether the commander or the coder is more responsible sometimes remains ambiguous. Di Nucci (2016) proposes, though, that the problem with sex robots is different and more solvable, because the environment is often more controlled, and the task more straightforward and ethically well defined before it happens (while in the case of military robots the morality is more ambiguous). He proposes that the designers are responsible for communicating to the decision makers (the physicians and caretakers in this case) the principles of how the robots work, so that it should be reasonably easy to predict whether a malfunction will happen based on the environmental stimuli and working demands the robot receives. Nonetheless, he admits that uncertainty is ever-present in our dynamic world and that things could get complicated in a legal context.
The second concern is competition with human workers, an issue often seen with automated technology. For many simple, automated tasks, robots have more consistent performance than humans and do not need to rest or be paid, so many companies prefer robots over human workers as a more economical source of labor. Because the kinds of jobs being replaced by robots often demand specific mechanical skills, which may be all the skills those workers have, it can be hard for people who lose their jobs to robots to find other work. In the case of sex robots, however, the task is much more complex and nuanced (Levy, 2012). Whether sex robots will be competitive enough to replace human sex workers may depend on factors such as the quality of the sexual services they provide, or society’s differential attitudes toward sex robots and sex workers. Human sex workers who chose their careers willingly would likely not welcome the competition; there is also the concern that, to keep making a living, they would undertake jobs that are legally ambiguous or put themselves at risk of mistreatment (Mackenzie, 2014). At this stage, it is hard to predict what will happen, and research on sex workers’ attitudes about this future competition and their willingness to change careers may give some insight.
The third concern is the privacy and safety of information. Di Nucci (2016), among others, has proposed that medical use of sex robots will allow us to monitor and regulate the sexual experience of the user with ease, but I think there is some potential danger in this idea. Sex is generally considered a practice that demands privacy, and the idea that the robot could record information about the sexual experience may make some users uncomfortable. Research on potential users’ attitudes towards the privacy of sex may give insight into this matter. Furthermore, a concern I have not seen in the literature is the “hackability” of sex robots. If sex robots rely on computer programming, chances are they are hackable, and people could reprogram them to behave differently from what the original designers intended. While there is no direct evidence for this possibility, we can draw analogies from how often hackers abuse computer systems to break into other people’s bank accounts or create phishing websites. This raises further concerns about what kinds of hardware sex robots should have on board; for example, if they have cameras, then even if the original design does not store video, once hacked they can pose threats to confidentiality. Perhaps a more alarming question: could sex robots be hacked and used to stalk, or even to murder? While these examples are extreme and not grounded in evidence, they speak to the need for professional codes or even legal regulations to make sure that sex robots cannot be hacked and employed in unethical ways.
Conclusions: More Information Needed
I hope it is clear that the discussions in the current article are far from exhaustive, that we have little data on what we (the industrialized West) currently believe or regard as morally acceptable or normal as a society, and that we must carry out more research or other forms of inquiry if we want to make informed, and consequently moral, decisions about whether we want a future with sex robots. If we do want such a future, we must try our best to imagine how different decisions will play out, until we find some consequent futures that feel harmonious and right, where people can live with minimal conflict and maximal happiness (Johnson, 2015). Considering that our capacity for mental simulation is physically limited, it might also make sense to take advantage of research tools such as computational modeling to boost our ability to explain data, form theories, and predict the future (Smaldino, 2017).
I also hope it is clear that the embodied approach — experience-driven and human-centered — is a sensible approach, if not the sensible approach, and that it will be more fruitful than an approach detached from a society’s current state and the actual experiences of the people who live in it (Coeckelbergh, 2009). Morality and moral principles are not static but emergent from our embodied experience of what feels right or wrong, and just as our environment and our experiences change dynamically, so do they (Johnson, 2015). Nonetheless, it is also important to recognize that our human condition means we never know it all, that we often lack the knowledge or skills to solve the problem at hand, and that we almost always create new problems with our solutions. However, it is only moral to try our best, to consider all the factors we can, and, when we realize something has gone wrong or another problem has come up, to meet the demands of growth, to learn, and to become better (Johnson, 2015). Sometimes we can only learn along the way (Whitby, 2008). The problem of sex robots may be complex, but we will stumble our way through it and end up somewhere, just as we have been doing for many thousands of years.
References
Amuda, Y. J., & Tijani, I. B. (2012). Ethical and legal implications of sex robot: an Islamic perspective. OIDA International Journal of Sustainable Development, 3(6), 19–28.
Cheok, A. D., Levy, D., Karunanayaka, K., & Morisawa, Y. (2015). Love and sex with robots. Handbook of Digital Games and Entertainment Technologies, 1–26. doi:10.1007/978-981-4560-52-8_15-1
Coeckelbergh, M. (2009). Personal robots, appearance, and human good: a methodological reflection on roboethics. International Journal of Social Robotics, 1(3), 217–221. doi:10.1007/s12369-009-0026-2
Di Nucci, E. (2016). Sexual rights, disability and sex robots. Forthcoming in J. Danaher & N. McArthur (Eds.), Sex Robots. Cambridge, MA: MIT Press. Available at SSRN: https://ssrn.com/abstract=2814471
Johnson, M. (2015). Morality for humans: ethical understanding from the perspective of cognitive science. Chicago: The University of Chicago Press.
Levy, D. (2012). The ethics of robot prostitutes. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot ethics: the ethical and social implications of robotics (pp. 223–231). Cambridge, MA: MIT Press.
Lunceford, B. (2013). Telepresence and the ethics of digital cheating. Explorations in Media Ecology, 12(1), 7–26. doi:10.1386/eme.12.1-2.7_1
Mackenzie, R. (2014). Sexbots: replacements for sex workers? Ethical constraints on the design of sentient beings for utilitarian purposes. Proceedings of the 2014 Workshops on Advances in Computer Entertainment Conference (ACE ’14 Workshops). doi:10.1145/2693787.2693789
Scheutz, M. (2012). The inherent dangers of unidirectional emotional bonds between humans and social robots. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot ethics: the ethical and social implications of robotics (pp. 203–221). Cambridge, MA: MIT Press.
Scheutz, M., & Arnold, T. (2016). Are we ready for sex robots? 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI). doi:10.1109/hri.2016.7451772
Smaldino, P. E. (2017). Models are stupid, and we need more of them. In R. R. Vallacher, A. Nowak, & S. J. Read (Eds.), Computational Models in Social Psychology. East Sussex, England: Psychology Press.
Whitby, B. (2008). Sometimes it’s hard to be a robot: A call for action on the ethics of abusing artificial agents. Interacting with Computers, 20(3), 326–333. doi:10.1016/j.intcom.2008.02.002
Whitby, B. (2012). Do you want a robot lover? The ethics of caring technologies. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot ethics: the ethical and social implications of robotics (pp. 233–248). Cambridge, MA: MIT Press.
