The Emerging Artificial Intelligence Wellness Landscape: Opportunities and Areas of Ethical Debate
Published, Cal. W. L. Rev. (2018)
This paper was presented on February 19th, 2018 at the California Western School of Law “AI Ethics Symposium”
Lydia Kostopoulos, PhD
Over the past decade, there has been a surge of new wellness technologies, catering to everyone from individual consumers to spas and hotels. Wearables with sensors that monitor steps, heart rate, sleep, and temperature have grown significantly in popularity. Similarly, there has been a boom in technologies that aid sleep and a plethora of new pleasure technology. Within the past five years, many wellness technologies have become increasingly fashion forward, from rings to necklaces capable of measuring your mood, heart rate, and steps. As cognitive technologies improve, the wellness technology market is now seeing its first wave of wellness technologies that incorporate artificial intelligence (AI).
Leading AI scientist Andrew Ng compares artificial intelligence to electricity and expects that it will change the way the world operates much as electricity did. IBM CEO Ginni Rometty sees IBM Watson's AI services as a $2 trillion opportunity. Forrester Research sees AI sparking an insights revolution, in which the data derived will drive change across companies, deliver personalized customer service, and ultimately increase profits. An entire book could be written on how businesses, services, and markets will be transformed by AI, but what is AI? The Oxford dictionary defines artificial intelligence as:
The theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.
What does artificial intelligence mean for wellness technologies, and what are the ethical implications? This paper attempts to answer these questions by examining particular technologies and the ethical questions they raise. The technological scope of this paper covers AI technologies that deliver wellness value without another human being involved; it makes a deliberate effort to examine AI technologies as stand-alone offerings in the wellness market. Because the inclusion of AI in wellness is a new addition to the expanding range of wellness services and products, it is an opportune moment to proactively discuss some of the emerging ethical questions. To better frame the discussion, this paper explores AI uses in wellness as they fall into three categories: intangible, tangible, and embedded.
II. DEFINING WELLNESS
Wellness has become a popular buzzword in the past couple of years, and a goal more frequently cited by individuals, workplace human resources departments, and hotel chains. This rise in interest in wellness has coincided with a period of increasing reports of economic struggles, a volatile labor market, and a growing number of people feeling anxious or depressed. Many definitions of wellness can be summarized by Merriam-Webster's: “the quality or state of being in good health, especially as an actively sought goal.” If wellness is being in good “health,” then it begs the question: What is health? The internationally recognized institution for health, the World Health Organization (WHO), defines health as “a state of complete physical, mental, and social well-being, and not merely the absence of disease or infirmity.”
However, now that ‘wellness’ has become increasingly popular, institutions have studied the topic, global studies have been conducted, and each has come to its own conclusions as to what exactly wellness is. Two organizations are worth noting. The first is the National Wellness Institute, which adopts the perspective of its co-founder, Dr. Bill Hettler. Dr. Hettler believes that wellness comprises six dimensions: emotional, spiritual, intellectual, occupational, physical, and social. This multidimensional perspective is part of an understanding of wellness that Dr. Hettler describes as a “conscious, self-directed and evolving process of achieving full potential.” This perspective is very compatible with the contemporary “Do-It-Yourself” (DIY) culture, and even more so with the advent of DIY AI therapy apps.
The second is the Organization for Economic Co-operation and Development (OECD), which believes that well-being is a critical component of economic development and launched the OECD Better Life Initiative to measure well-being and progress in achieving it. The initiative measures “quality of life” and “material conditions.” Within “quality of life” it measures health status, work-life balance, education and skills, social connections, civic engagement and governance, environmental quality, personal security, and subjective well-being. Within “material conditions,” it measures income and wealth, jobs and earnings, and housing. These elements build an index that the OECD sees as part of a cycle in which well-being is sustained over time through the preservation of natural capital, economic capital, human capital, and social capital.
While relevant and thoroughly multi-disciplinary, these comprehensive approaches evaluate wellness from a holistic perspective. The wellness technologies emerging with AI, however, are not as comprehensive. As such, this paper explores AI wellness technologies through the lens of the WHO definition, which encompasses mental, physical, and social health.
III. INTANGIBLE, TANGIBLE, AND EMBEDDED ARTIFICIAL INTELLIGENCE
This paper has divided artificial intelligence (AI) into three mediums in an effort to more closely assess emerging AI technologies used within the field of wellness. The three mediums are intangible, tangible, and embedded.
The technologies discussed represent a few of the many that are emerging. Many people are familiar with smart watches that report specific metrics and use machine learning to make health predictions; one medical example is research using Fitbit and Apple Watch data analytics to predict a user's risk of diabetes. The technologies explored in this paper, however, are those that directly and primarily relate to mental and emotional health, which this research perceives to play an important role in underlying social health problems and to be an important factor in physical health problems that are not genetic in origin. The research discussed highlights some of the latest emerging AI capabilities in the wellness space. The following sections expand on each of the three AI categories with technological examples, their benefits to well-being, and their ethical implications:
A. Intangible AI Wellness
For the purpose of this research, intangible AI does not have a physical form; instead, it is communicated through a sound, a notification on a device, and/or invisible computation running in the background and called upon on demand for information or advice.
According to the American Foundation for Suicide Prevention (AFSP), suicide has become a leading cause of death in America, and the number of deaths by suicide is rising. In a study on antidepressant use in the United States, the National Center for Health Statistics at the Centers for Disease Control and Prevention found that:
Antidepressants were the third most common prescription drug taken by Americans of all ages in 2005–2008 and the most frequently used by persons aged 18–44 years. From 1988–1994 through 2005–2008, the rate of antidepressant use in the United States among all ages increased nearly 400%.
In efforts to address these growing concerns, clinical researchers and start-ups alike have started to explore the utility of AI technology to identify, diagnose, prevent, manage, and solve these problems. One such initiative comes from the AI startup Mindstrong, which sees the smartphone as an emotional diagnostics device that can help with mental health and wellness. President and co-founder Dr. Thomas Insel (former Director of the National Institute of Mental Health) hopes to leverage AI to predict emotional health concerns before they arise. To achieve this goal, Dr. Insel is attempting to bridge the gap between therapy visits and daily life through early warnings derived from measurements and assessments from digital phenotyping. Digital phenotyping is the “moment-by-moment quantification of the individual-level human phenotype in situ using data from personal digital devices.” In other words, Mindstrong aims to assess phone use patterns (typing, scrolling, etc.) when someone is relaxed and when someone is upset. These patterns can help paint a picture of a user's emotional state.
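To make the idea of digital phenotyping concrete, the sketch below shows one way phone interaction data might be reduced to behavioral features. This is a minimal illustration under stated assumptions, not Mindstrong's actual method: the input (a list of screen-tap timestamps) and the feature names are hypothetical, and a real pipeline would use far richer signals.

```python
import statistics

def phenotype_features(tap_timestamps):
    """Reduce one session of screen-tap timestamps (in seconds) to simple
    interaction-pattern features. Hypothetical features for illustration:
    changes in tapping rhythm could, in principle, correlate with mood."""
    # Inter-tap gaps capture the rhythm of interaction.
    gaps = [b - a for a, b in zip(tap_timestamps, tap_timestamps[1:])]
    duration_min = (tap_timestamps[-1] - tap_timestamps[0]) / 60
    return {
        "mean_gap_s": statistics.mean(gaps),        # average pause between taps
        "gap_variability_s": statistics.pstdev(gaps),  # erratic vs. steady rhythm
        "taps_per_minute": len(tap_timestamps) / duration_min,
    }
```

A longitudinal system would compute such features continuously and flag sessions that deviate from the user's own baseline, which is the “early warning” idea described above.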
At the University of Southern California (USC) Institute for Creative Technologies, researchers are working to create what they have dubbed a “virtual therapist” named Ellie. Through a webcam and microphone, the AI processes and analyzes emotional cues derived from the patient's face, variations in expression, and tone of voice. Used only in research settings so far, the researchers found that when speaking with Ellie, patients “feel less judged by the virtual therapist and more open [to discussing their feelings].” The institute's social psychologist, Gale Lucas, explained that “It’s about what’s happening in the moment — having a safe place to talk.”
Outside the research lab, there is an app called Addicaid, which is meant to help its users avoid addictive behaviors that are destructive to their mental, emotional, and physical well-being. Addicaid has the potential to help people who struggle with substance abuse and process disorders, including food, gambling, internet usage, alcoholism, and pornography. Leveraging clinical research, machine learning, and adaptive AI, “Addicaid predicts when a person might be at risk of falling into addictive behaviors and offers personalized treatment options for that individual.” For example, for someone struggling with alcohol abuse, the app can use GPS to intervene when the user is approaching or entering a trigger location (such as a bar or liquor store). In these instances, the app would provide information for a hotline and offer additional coping tools and techniques.
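The GPS intervention described above amounts to geofencing: checking the user's location against known trigger spots. The sketch below illustrates the mechanism; it is not Addicaid's implementation, and the trigger location, coordinates, and alert radius are invented for the example.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical trigger locations: (name, latitude, longitude).
TRIGGER_SPOTS = [("Example Bar", 40.7484, -73.9857)]
ALERT_RADIUS_M = 150  # assumed alert distance in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))  # Earth radius ~6,371 km

def check_triggers(lat, lon):
    """Return the names of trigger locations within the alert radius,
    at which point the app could surface a hotline and coping tools."""
    return [name for name, tlat, tlon in TRIGGER_SPOTS
            if haversine_m(lat, lon, tlat, tlon) <= ALERT_RADIUS_M]
```

In a real app, the triggered intervention (hotline prompt, coping exercise) would hang off the non-empty result of a check like this, run whenever the OS reports a location update.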
If calling a hotline and speaking to a human is problematic, the Woebot app is available for therapy sessions 24/7. The product of a team of Stanford psychologists and AI experts, Woebot tracks its users' mood through brief daily chat conversations and offers curated videos and word games to assist users in managing their mental health. Alison Darcy, Woebot CEO and psychologist, believes that Woebot has the potential to improve on human therapists because there is “a lot of noise in human relationships… [and] noise is the fear of being judged.” Woebot leverages a deterministic conversational approach with open-ended questions such as “How are you feeling?” and “What is your energy like today?,” question prompts that have been modeled on talk therapy and cognitive behavioral therapy.
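A “deterministic conversational approach” means scripted prompts and rules rather than free-form generation. The toy sketch below shows the shape of such a flow; the keywords and canned responses are invented for illustration and are not Woebot's actual script.

```python
# Hypothetical scripted check-in in the spirit of a deterministic chat flow:
# a fixed keyword-to-response table plus a default open-ended prompt.
RESPONSES = {
    "anxious": "Noticing anxiety is a good first step. What do you think triggered it?",
    "tired": "Low-energy days happen. What is one small thing that usually recharges you?",
    "sad": "Thank you for telling me. Would you like to try writing down the thought behind that feeling?",
}
DEFAULT_PROMPT = "Thanks for sharing. What is your energy like today?"

def reply(user_message):
    """Return a scripted response: the first keyword match wins,
    otherwise fall back to an open-ended prompt."""
    text = user_message.lower()
    for keyword, response in RESPONSES.items():
        if keyword in text:
            return response
    return DEFAULT_PROMPT
```

The determinism is the point: because every possible reply is authored in advance by clinicians, the bot cannot drift into unsafe or untested territory the way a generative model might.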
Unlike the other apps described, Therachat leverages an AI chatbot to augment therapy sessions rather than replace them. It has a HIPAA-compliant “smart journaling” tool that is customizable so that the therapist can assign homework to the patient, for example, participating in emotion tracking. The therapist receives status updates and reports through the platform and incorporates this data into the sessions.
Within the space of hotel wellness, Marriott is exploring a concept it is dubbing the “hotel room of the future.” In this room, “Marriott would let guests control everything from the temperature of the shower to the color of the light with the sound of their voice.” Guests' settings and preferences would be saved, and the AI would be able to draw on them during their next stay. For example, if a guest normally likes to meditate in the morning, the AI could use the room lighting to wake the guest gently in time for their meditative session. At this point it is still a prototype, but existing technology also makes it possible for the AI to gather information about the guest that is publicly available on social media. With time, these services will unfold, and guests will be offered options to connect their hotel AI profile to their social media profiles for more services, which, apart from sentiment analysis, would also include precision advertising and recommendations for hotel wellness promotions and dining options that cater to the guest's dietary specifications.
All these technologies are useful approaches to improving emotional wellness, but they do not come without ethical implications. In an era of prevalent hacking and data leaks, the highly sensitive data gathered by such AI apps and interfaces must be treated with extra care; in cybersecurity it is said that it is not a matter of if an organization will be hacked but when. Apart from data security implications, there are other ethical questions relating to the human aspect. There is something about the authenticity of another human's lived experience that adds value to an experience, be it a therapy session or a hotel stay.
An algorithmic bot does not understand vice and has never felt heartbreak or had to bury a loved one. While having an algorithmic bot therapist available 24/7 is convenient and helpful for letting one's feelings and emotions out in real time as they occur, it cannot substitute for human empathy. In the case of addiction, while an app may assist in thwarting some circumstances of addiction–and there are some people who can benefit from a nudge–one cannot ignore the holistic picture of an individual's life and the emotional or lifestyle triggers that would prompt the individual to engage in addictive behavior in the first place. Human therapists make judgment calls throughout sessions and can be agile in treatments and recommendations. While therapists and bots can help, friends and family may also serve as an important support system, something that is sometimes forgotten as these technologies are designed.
Although the hotel room of the future, embedded with AI wellness tools, will create a unique, personalized service, hyper-personalized wellness services that build a relationship between the guest and the AI capability could further expand the “digital bubbles” we live in. This technology has the potential to make its users less accommodating and to increase feelings of entitlement. In this regard, it may unfairly raise expectations of human service to be on par with AI memory and instant personalization.
In its current state, intangible AI and emotional wellness technology is rigid and does not have the response fidelity one can expect from a human. This in no way means that these technologies should not exist or that they do not serve a useful purpose. Instead, it creates a new space and an opportunity for them to be woven into other aspects of life, whether feeding actionable information to a therapist or wellness professional or supporting work toward personal wellness goals. Another aspect of these technologies has to do with the rise in anxiety and psychological issues in the United States today. AI wellness technologies can help make therapy more physically and financially accessible, but they could also serve as an illusory form of therapy that fails to produce results. There should be metrics by which users can hold themselves accountable for their progress, or lack thereof, while using these technologies. As with everything else in life, it is not a ‘one size fits all’ model; the danger comes when we pretend it is.
B. Tangible AI Wellness
This research sees tangible AI as embodied in a physical form with which humans can interact. It can be AI in a vehicle, a robotic pet, a doll, factory equipment, etc. As with intangible AI, the wellness landscape for tangible AI is still an emerging commercial market.
Quality sleep has claimed the spotlight over the past few years, particularly with Arianna Huffington's book, “The Sleep Revolution,” which discusses cultural problems with sleep while setting forth a strong case for the value of sleep to our health, longevity, and cognitive capacity. One of the sleep technologies she recommends is S+ by ResMed, a contactless device that is placed on the nightstand and monitors breathing, room temperature, noise, and quality of sleep. According to Michael Wren, the ResMed Senior Director of Technology and Operations:
“The idea is that every breath you take is sent to the cloud and, over time, the technology can grow a picture based on the data about the person’s sleep, environment, activity and stress. These all build into an artificial intelligence algorithm in the cloud and the user gets a personalized train of feedback.”
Another tangible sleep technology is the ZEEQ smart pillow, which is smart-home compatible and can communicate with other AI devices such as Amazon Alexa for voice-controlled sleep reports. It tracks sleep (duration, snoring volume, and movements) to determine a baseline of habitual sleeping and establish what a normal sleep cycle is for the user. The data derived from sleep tracking can be visualized in the app and is used to determine sleep duration, the user's sleep cycles, and snoring's impact on restfulness. The results are reported daily and collated over time. Given this data, the pillow activates its smart alarm at the optimal moment in the sleep cycle.
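The smart-alarm logic described above can be sketched simply: given a sequence of sleep-depth estimates, wake the user at the lightest moment inside an allowed wake window. This is an illustrative sketch, not ZEEQ's algorithm; the epoch format and depth scale are assumptions for the example.

```python
def smart_alarm_time(epochs, window_start, window_end):
    """Pick a wake time from sleep-tracking data.

    epochs: list of (minute_of_day, depth) pairs, where a lower depth
            value means lighter sleep (hypothetical scale for this sketch).
    window_start/window_end: the minutes between which waking is allowed.

    Returns the minute of the lightest-sleep epoch inside the window,
    falling back to the hard deadline (window_end) if none falls inside.
    """
    candidates = [(minute, depth) for minute, depth in epochs
                  if window_start <= minute <= window_end]
    if not candidates:
        return window_end  # never let the user oversleep the window
    return min(candidates, key=lambda pair: pair[1])[0]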
There are some who no longer trust their own feelings about the quality of their sleep; instead, they must first sync their sleep device, receive the sleep report, and then determine how they slept. Our bodies speaking to us and notifying us of pain and discomfort has played an important role in our evolutionary path, and ignoring what our instincts say could be harmful to our health. The more authoritative data and analysis we derive from AI and devices focused on our well-being, the higher the chance that we relinquish our own authority and judgment regarding well-being to technology. This poses an ethical concern, as people may try to fit themselves into the “box” deemed normal and healthy. This box will be determined by algorithms and data from studies that may not be representative of an entire population or account for differences in lifestyle or DNA.
Some AI wellness robots have already been used successfully, namely Paro the therapeutic seal, which gives patients an opportunity to take advantage of the documented benefits of animal therapy without having to leave their care facilities or deal with the challenges of owning live animals. Paro is a robot seal with a soft, furry coat that purrs when caressed and feels good to pet. It has sensors that allow it to follow the patient with its eyes, and it responds to being touched. Paro has been used to comfort patients with dementia and to reduce stress for both patients and their caregivers. AI gives Paro the “ability to ‘learn’ and remember its own name, and it can learn the behavior that results in a pleasing stroking response and repeat it.”
While there are already helpful devices and automated machines in health care, Stevie the elderly care robot represents a growing category of AI robots. Created at Trinity College Dublin, Stevie “can perform a range of functions ranging from medication reminders and light conversation to video calls with family members.” Its face, which is a screen, can also provide picture prompts for those with hearing impairments. Should a patient or user become unresponsive, Stevie can contact emergency services. Apart from reminding users to take their medicine and serving as a video-messaging medium for calling family and friends, AI allows it to provide a form of companionship through conversational skills. Companionship features are increasingly sought after for the aging population demographic–which often feels isolated–and could revolutionize home care for the elderly.
While such robots have utility and can contribute to the well-being of a wide range of users, they replace a form of care and attention that has always, since the beginning of time, been given exclusively by humans. Today people live busy lives with jam-packed schedules, and these care robots can give loved ones peace of mind. However, passing on care to robots is a form of responsibility transfer. As such, it is important for each family to thoughtfully evaluate how much care is being transferred to a robot.
Companion robots, however, have not been made solely for elderly care. RealBotix is a company that sells custom life-size female sex dolls with customizable personalities. Customers can order very realistic-looking and realistic-feeling dolls made exactly to their preferences, from skin color to breast cup size, waist size, rear end mass, hair color, eye color, and even pubic hair style. The dolls are for sexual use and have been manufactured for several years, but they have recently been upgraded with AI technology. Each doll comes with an app that allows users to customize its personality from a number of options and adjustments; example personality traits include cruelty, humility, meanness, patience, courage, charm, tenacity, sensuality, and aggression, to name a few.
Putting aside the personal use of these sex dolls, they also have other uses, one being as a sex surrogate. Surrogate partner therapy involves a three-person therapeutic team: the client, the therapist, and a surrogate partner. In this form of therapy, “the surrogate participates with the client in structured and unstructured experiences that are designed to build client self-awareness and skills in the areas of physical and emotional intimacy.” For those who have sexual traumas and mental difficulties with intimacy, AI sex robots can be used as an alternative form of surrogacy in place of a human being. Additionally, the AI software can be trained to have conversations relating to sexual trauma and intimacy problems. Another use for sex robots is in nursing homes, where patients have lost the independence and privacy they had living in their own homes. Sexual well-being is also a part of wellness, and AI-enabled sex robots can provide that to patients in nursing homes.
These AI-enabled sex robots can also be companions, particularly for those who are socially less confident, as well as those who live alone and would like “someone” to talk to at home. With the rise in loneliness, this could be another example of technology offering a solution. However, while the AI sex robot has the potential to offer an alternative form of sexual intimacy and conversational companionship, it poses ethical questions relating to wellness and emotional well-being. Customizing an AI personality to one's needs can be a dangerous habit to become accustomed to, as it is not possible to go to another human being and adjust their personality to better suit one's preferences. Would people choose to spend time with their personalized AI companion over other human beings? If so, how much of a problem would that be for society? Would conversation with an AI-enabled, human-looking robot doll alleviate the feelings of loneliness that stem from lack of human connection? AI will eventually reach the point of more meaningful conversation, and perhaps some people will prefer to marry AI robots. Robots with machine learning will learn and grow over the course of the relationship. But would it ultimately be a relationship with a non-consenting synthetic robot with simulated intelligence and no consciousness? This is very different from a relationship with a consenting human being who is sentient, conscious, and has self-determination.
Not to be outdone by the health and sex industries, hotels are also exploring AI-enabled robots. In robot-friendly Japan, there is a hotel called Henn na Hotel, which means “Weird Hotel,” and it is almost exclusively staffed with robots that check in guests, deliver their luggage to their rooms, and answer their questions and requests in the room. From the anthropomorphized dinosaur at the reception desk, to the robot luggage carrier, to the mini nightstand robot that helps adjust the temperature and light, among other things, the entire experience can be a human-less one. One ethical question to ask is: Can there be too much technology? What is the right balance between human and machine? There are countless known and unforeseen variables that come into play on a daily basis, making human life chaotic. A hotel that wishes to offer its guests calm and peace of mind would do well to have humans who can better respond to ambiguity and human emotion.
C. Embedded AI Wellness
Embedded AI is AI fused with the brain through either an invasive or a non-invasive mechanism. While it remains a technology in its very early stages, it is a form of brain-computer interface (BCI) that has the capability to augment the human brain (intelligence, mood, etc.).
The Defense Advanced Research Projects Agency (DARPA) is exploring neuroscience and brain-computer interface research under its Brain Initiative. Some of the research projects in the initiative focus on neural connections that can be stimulated, or interacted with, in a way that produces a healing response or a sensation in areas where there is none.
Among other companies, Neuralink is actively pursuing research to connect the brain to the computer and leverage AI capability with the brain. Specifically, it is “developing ultra high bandwidth brain-machine interface to connect humans to computers.” This technology will create a new understanding of emotional and mental well-being. Given scientists' knowledge of which neurons trigger happiness, having BCI capability may create other forms of addiction, and some people may not be able to start their day without a “happiness” fix. In theory, happiness is an elevated joyful state above the normal state; if the feeling of happiness could be induced artificially on command, would an individual abusing it eventually be able to feel only neurally induced happiness? It is difficult to predict what having AI connected to the brain would mean for wellness, but safety measures could be put in place so that humans could not auto-pilot their way out of dealing with emotional distress. Otherwise, it could arguably create an emotionally and mentally weaker person, which in turn (at a mass scale) could affect evolution.
Understanding the brain, and being able to map it and connect it to a computer, will create another avenue to immortality. The theory is that one would be able to upload one's brain through whole brain emulation. This technology is being developed but does not yet exist; a service of that kind would leverage AI to create re-enlivened forms of deceased people from their social media. Companies such as Eterni.me help their clients become “virtually immortal” through a curated intelligent avatar. This technology is useful for those who feel they are not ready to part with loved ones who have passed on, and it has the potential to augment a person's emotional and mental well-being as they continue to remain close to a deceased family member or friend. It even has the potential to maintain a relationship with a therapist who has passed away. But while keeping memories alive, and creating new ones with loved ones whose memories and digital data have been AI-enabled for digital re-enlivenment, this technology creates a reality dissonance around mortality and relationships with the dead. Finally, it has the potential to postpone or halt the grieving process, which may be harmful to an individual's emotional and mental well-being. Time will tell how this technology will be adopted by culture and society.
As supporters of technological solutionism continue to promote new and innovative technologies, it is important for society to turn back to basics and challenge the fundamental assumptions about the problems these technologies attempt to address.
Considering that AI in wellness technology is only a few years old, there is an opportunity at this early stage to have impactful discussions about AI wellness before these technologies become more and more a part of daily life. Technologists should hold these discussions on the premise of human wellness and well-being, including how technology can ethically augment wellness without side effects.
Wellness is very much a human endeavor, and as we seek to replace wellness professionals with AI (in whatever form), thought should be given to how that could detract from the lasting effects of the wellness experience. Perhaps a hybrid AI-human wellness professional would be the best combination, leveraging the best of both worlds.
The start of the 21st century has been filled with rapid technological change and with questions that challenge us to ponder the status quo of human relationships and the essence of humanity. This means we should engage with new technologies, those who create them, and our communities with a tremendous amount of courage to look within our humanity, as uncomfortable as that may be.
Meaning is assigned to things, events, people and life at an individual, societal and cultural level. At such an early stage, there is an opportunity to shape the meaning of new technologies before they are haphazardly assigned empty stereotypical meanings. Resolving the ethical aspects of these technologies as they pertain to wellness will have to be a deliberate and communally inclusive effort that challenges social norms. Ethical conundrums such as these are best placed in the #BetterWhenShared category. This research hopes to contribute to the conversation and further the discussion as continued research develops.
Dr. Lydia Kostopoulos’ (@LKCYBER) work lies at the intersection of people, strategy, technology, education, and national security. Her professional experience spans three continents, several countries, and multi-cultural environments. She speaks and writes on disruptive technology convergence, innovation, tech ethics, and national security, and founded Sapien21 (www.sapien21.com), which encompasses these aspects. She is an advisor to the AI Initiative at The Future Society at the Harvard Kennedy School, participates in NATO’s Science for Peace and Security Program, is a member of the FBI’s InfraGard Alliance, and during the Obama administration received the U.S. Presidential Volunteer Service Award for her pro bono work in cybersecurity. www.lkcyber.com
Lynch, Shana. (2017). “Andrew Ng: Why AI is the new electricity.” The Dish: Stanford News. Retrieved from https://news.stanford.edu/thedish/2017/03/14/andrew-ng-why-ai-is-the-new-electricity/
Darrow, Barb. (2016). “Through machine learning, IBM Braintrust sees better days ahead.” Fortune. Retrieved from http://fortune.com/2016/02/25/ibm-sees-better-days-ahead/
McCormick, James. (2017). “Predictions 2017: Artificial Intelligence Will Drive the Insights Revolution. Advanced insights will spark digital transformation in the year ahead.” Forrester Research. Retrieved from https://go.forrester.com/wp-content/uploads/Forrester_Predictions_2017_-Artificial_Intelligence_Will_Drive_The_Insights_Revolution.pdf
Oxford Dictionaries. (2017). “Definition of Artificial Intelligence.” Retrieved from https://en.oxforddictionaries.com/definition/artificial_intelligence
The Economist. (2015). “What slowing trade growth means for the world economy.” Retrieved from https://www.economist.com/blogs/economist-explains/2015/09/economist-explains-10
World Economic Forum. (2016). “The Future of Jobs: Employment, Skills and Workforce Strategy for the Fourth Industrial Revolution.” Global Challenge Insight Report. Retrieved from http://www3.weforum.org/docs/WEF_Future_of_Jobs.pdf
Singal, Jesse. (2016). “For 80 years, young Americans have been getting more anxious and depressed, and no one is quite sure why.” The Cut: Mental Health. Retrieved from https://www.thecut.com/2016/03/for-80-years-young-americans-have-been-getting-more-anxious-and-depressed.html
Merriam-Webster Dictionary. (2017). “Wellness.” Retrieved from https://www.merriam-webster.com/dictionary/wellness
National Wellness Institute. (2017). Retrieved from http://www.nationalwellness.org/?page=six_dimensions
Hettler, Bill. (2017). “The Six Dimensions of Wellness.” National Wellness Institute. Retrieved from http://www.nationalwellness.org/?page=six_dimensions
OECD. (2017). “Measuring Well-Being and Progress.” OECD Better Life Index. Retrieved from https://www.oecd.org/statistics/measuring-well-being-and-progress.htm
Hobbs, Andrew. (2018). “Fitbit and Apple Watch can help predict diabetes risk, study reveals.” Internet of Business. Retrieved from https://internetofbusiness.com/deepheart-fitbit-apple-watch-predict-diabetes-risk/
Centers for Disease Control and Prevention. (2011). “Antidepressant Use in Persons Aged 12 and Over: United States, 2005–2008.” National Center for Health Statistics. Retrieved from https://www.cdc.gov/nchs/data/databriefs/db76.htm
Al Idrus, Amirah. (2017). “AI startup Mindstrong bags $14M for mental health mission.” FierceBiotech (MedTech). Retrieved from https://www.fiercebiotech.com/medtech/ai-startup-mindstrong-bags-14m-for-mental-health-mission
Torous, John; Staples, Patrick; Onnela, Jukka-Pekka. (2015). “Realizing the Potential of Mobile Mental Health: New Methods for New Data in Psychiatry.” Current Psychiatry Reports (Psychiatry in the Digital Age). Springer. Retrieved from https://link.springer.com/article/10.1007/s11920-015-0602-0
Tieu, Andrew. (2015). “We Now Have an AI Therapist, and She’s Doing Her Job Better than Humans Can.” Futurism (Enhanced Humans). Retrieved from https://futurism.com/uscs-new-ai-ellie-has-more-success-than-actual-therapists/
Walravens, Samantha; Cabot, Heather. (2017). “How this CEO is using artificial intelligence to treat addiction.” Women @ Forbes. Forbes. Retrieved from https://www.forbes.com/sites/geekgirlrising/2017/11/02/from-addict-to-a-i-entrepreneur-how-this-ceo-is-using-technology-to-treat-addiction/#529a3890408e
Molteni, Megan. (2017). “The Chatbot therapist will see you now.” Wired (Science). Retrieved from https://www.wired.com/2017/06/facebook-messenger-woebot-chatbot-therapist/
Pennic, Jasmine. (2017). “Therachat Unveils AI-Chatbot Platform to Augment Therapy Sessions.” HIT Consultant. Retrieved from http://hitconsultant.net/2017/06/27/therachat-launches-ai-chatbot-platform/
Zimmermann, Joe. (2017). “Marriott Unveils ‘Smart’ Hotel Room Prototypes, With Personalized Presets and Voice Control.” Bethesda Magazine. Retrieved from http://www.bethesdamagazine.com/Bethesda-Beat/2017/Marriott-Unveils-Smart-Hotel-Room-Prototypes-With-Personalized-Presets-and-Voice-Control/
Townsend, Chris. (2018). “Federal Agencies Bracing for New Cyber Challenges in 2018.” Symantec. Retrieved from https://www.symantec.com/blogs/feature-stories/federal-agencies-bracing-new-cyber-challenges-2018
Huffington, Arianna. (2018). The Sleep Revolution. AriannaHuffington.com. Retrieved from http://ariannahuffington.com/books/the-sleep-revolution-tr/the-sleep-revolution-hc
O’Connell, Claire. (2017). “Sleep trackers dig for better data.” The Irish Times (Business). Retrieved from https://www.irishtimes.com/business/innovation/sleep-trackers-dig-for-better-data-1.3150888
Kapfunde, Muchaneta. (2017). “ZEEQ, The Anti-snoring Smart Pillow That Streams Music and Tracks Your Sleep.” FashNerd. Retrieved from https://fashnerd.com/2017/11/zeeq-smart-pillow-streams-music-stops-snoring-tracks-your-sleep/
 Griffiths, Andrew. (2014). “How Paro the robot seal is being used to help UK dementia patients.” The Guardian — Society. Retrieved from https://www.theguardian.com/society/2014/jul/08/paro-robot-seal-dementia-patients-nhs-japan
D’Arcy, Ciarán. (2017). “‘A cute little fecker’: Trinity’s Stevie the robot helps older people.” The Irish Times. Retrieved from https://www.irishtimes.com/business/technology/a-cute-little-fecker-trinity-s-stevie-the-robot-helps-older-people-1.3290009
McGinn, Conor. (2017). “The Robot that could revolutionise home care for elderly people.” The Independent. Retrieved from https://www.independent.co.uk/life-style/health-and-families/health-news/the-robot-that-could-revolutionise-home-care-for-elderly-people-stevie-us-a8068931.html
 International Professional Surrogates Association. (2018). Surrogate Partner Therapy. Retrieved from http://www.surrogatetherapy.org/what-is-surrogate-partner-therapy/
Associated Press. (2015). “Japan’s robot hotel: a dinosaur at reception, a machine for room service.” The Guardian. Retrieved from https://www.theguardian.com/world/2015/jul/16/japans-robot-hotel-a-dinosaur-at-reception-a-machine-for-room-service
DARPA. (2018). “DARPA and the Brain Initiative.” U.S. Department of Defense. Retrieved from https://www.darpa.mil/program/our-research/darpa-and-the-brain-initiative
Sandberg, Anders; Bostrom, Nick. (2008). “Whole Brain Emulation: A Roadmap.” Future of Humanity Institute. Retrieved from http://www.fhi.ox.ac.uk/brain-emulation-roadmap-report.pdf
Meese, James; Nansen, Bjorn; Kohn, Tamara; Arnold, Michael; Gibbs, Martin. (2015). “Posthumous personhood and the affordances of digital media.” Mortality, Vol. 20, No. 4, 408–420. Routledge. Retrieved from https://doi.org/10.1080/13576275.2015.1083724