Design for Tension — Designing a Chat Box for Friends of Depressed People
Summary
In this Design for Tension project, we created a chatbot that tackles a tense topic. We chose a concept centered on depression, since it is a difficult area both for those directly affected by it and for the people close to them. To narrow down the target group, we decided to aim the bot at teenagers who are concerned for friends struggling with depression. We built the chatbot as a solution to the following question:
How might we build a bot to spread awareness to teenagers who are concerned for their friends struggling with depression?
Our overarching goals were to provide these teenagers with information about certain aspects of depression, advise them on what to do in particularly serious situations, and answer their questions. Below is the demo video of our final product:
Brainstorm/Ideate
The first step we took in this project was to figure out the topic we wanted our chatbot to be based on. We brainstormed a number of topics and listed their pros and cons, as shown in the pictures below:
Ultimately, we chose to go with the depression concept. Initially, we imagined this chatbot as a comfort service for young adults dealing with the mental illness. However, we decided it would be more interesting to implement it as a training service for friends of affected people, on the premise that we could spread more awareness about depression and help people who don’t know what to do in certain situations. It would live on a website about depression, under a “resources for friends” tab.
In addition, we wanted to give the end-user accurate information, so we conducted some research on the web. We looked into symptoms and other resources to recommend, such as over-the-counter medications and psychologists. To convey this information effectively and make the bot more personable, we created a personality for the bot and named it Omni. Moreover, we discussed how to make the conversation feel more natural and ran the formative user testing described later to evaluate this aspect.
Along the way we made some assumptions about the user as the problem was open-ended. We assumed the bot wouldn’t be talking to the actual person with depression, the user already has some knowledge about what this chatbot is for, and the friend is looking for help for the depressed person.
User Intents
We wanted to take into account some basic interface design rules such as allowing people to feel in control of the situation and making their options clear, which are especially important for our topic and target group. To elaborate on this, we established some key user goals that we wanted to address in our project:
- Provide information and advice to spread awareness of depression to teenagers.
- Help users learn about what to do and how to act in serious and difficult situations.
- Determine sources to recommend to help the affected individual.
- Show users how to be supportive of friends struggling with depression by telling them what to say.
- Answer any questions the users may have about depression.
- Make the user feel comfortable with talking to the bot.
Key User Inputs
We determined some keywords that we thought the user might enter and the bot should know how to respond to. These are organized in the following categories:
- People: Friend, Family Member, Psychologist, Professional Help
- Feelings: Depression, Sad, Sleepy, Trouble, Emotional, Distant/Withdrawn, Stressed, Alone, Failing, Isolated
- Alarm: Help, Emergency, Suicidal, Self harm, Symptoms, Bullying/Bullied
- Other: Seasons/Winter, Thanks
Each word is meant to act as a trigger that elicits a response from the chatbot. For example, the alarm-category words inform the chatbot that the situation is serious, prompting the user to call the suicide prevention center.
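As an illustration of this trigger logic (Flow.xo handles the actual matching internally, so this is only a hypothetical sketch), the category lists below mirror our keyword table, while the function name and response strings are our own condensed stand-ins:

```python
# Hypothetical sketch of the keyword-trigger logic; the real bot is built
# in Flow.xo. Category keyword sets mirror our table above; the response
# strings are condensed, illustrative versions of Omni's replies.
KEYWORD_RESPONSES = {
    "alarm": (
        {"help", "emergency", "suicidal", "self harm", "symptoms",
         "bullying", "bullied"},
        "This sounds serious. Please call the Suicide Prevention Center "
        "right away.",
    ),
    "feelings": (
        {"depression", "sad", "sleepy", "trouble", "emotional", "distant",
         "withdrawn", "stressed", "alone", "failing", "isolated"},
        "Those can be symptoms of depression. Would you like to learn "
        "what to look out for?",
    ),
    "people": (
        {"friend", "family member", "psychologist", "professional help"},
        "Encouraging your friend to talk to someone they trust is a "
        "great first step.",
    ),
    "other": (
        {"winter", "seasons", "thanks"},
        "Happy to help! Seasonal changes can affect mood, too.",
    ),
}

def respond(message: str) -> str:
    """Return the reply for the first keyword category found in the message."""
    text = message.lower()
    # Alarm keywords are checked first so serious situations take priority.
    for category in ("alarm", "feelings", "people", "other"):
        keywords, reply = KEYWORD_RESPONSES[category]
        if any(word in text for word in keywords):
            return reply
    return "I'm not sure I understood. Could you rephrase that?"
```

Checking the alarm category first reflects the priority described above: a serious situation should override any other keyword in the same message.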
Design Flow
After brainstorming and collectively putting together all of the information we gathered, we created a flow diagram to develop the various conversations the bot could have with the user.
While developing this flow chart, we touched upon the user intent goals we had in mind. One thing we wanted was to make the chat service completely anonymous so that users would feel comfortable speaking to the bot. In addition, we planned to base the majority of the questions on yes or no answers so that users could get information faster, especially during crucial times when a depressed person is contemplating suicide. The keywords would come in handy when users asked their own questions. We still wanted to keep some interactive aspect to the conversation so it felt more natural.
Formative User Testing
As mentioned above, we wanted to test how natural the conversation with the bot would be and areas to improve so we conducted some formative user testing. We got a lot of useful suggestions and ideas on areas we could refine our bot. Below is a list of this advice:
- It isn’t clear what the user is supposed to type in response to the first message from the bot; it took a couple of tries to trigger the correct response
- Make the input more consistent — use either free responses or buttons
- Would be better if it were more natural, but on the right track with making a personality for the bot
- Bot is informational, but more information/interactivity could be added
- The yes/no format is good
- Good concept of making target group as friends
- On the free-response questions, maybe tell the user that the bot expects a yes or no answer for most of them
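One lightweight way to act on the yes/no consistency feedback above (again, sketched outside Flow.xo as an assumption about how one might implement it) is to normalize a free response before branching, and re-prompt when it matches neither:

```python
# Hypothetical helper addressing the yes/no feedback: map a free-text
# response to "yes", "no", or None (meaning the bot should re-prompt
# with "Please answer yes or no."). The accepted phrasings are guesses.
YES = {"yes", "y", "yeah", "yep", "sure", "ok"}
NO = {"no", "n", "nope", "not really"}

def normalize_yes_no(message: str):
    """Normalize a user reply to 'yes', 'no', or None for ambiguous input."""
    text = message.strip().lower().rstrip(".!?")
    if text in YES:
        return "yes"
    if text in NO:
        return "no"
    return None  # Ambiguous: bot should re-prompt for a yes/no answer.
```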
Final Product
We used Flow.xo to create our chatbot. We chose this platform because we were all able to collaborate and it was, for the most part, easy to use through its straightforward construction environment. The built-in delay and typing-awareness dot features helped create a natural conversation between the bot and the user. We did encounter some challenges: it was difficult to prompt the chatbot to answer user questions, since the format was so open-ended that any question could be asked. Although we were not able to implement everything we had in mind, we were satisfied with how we captured the important goals of assisting friends of people struggling with depression.
Our process for making the chatbot is highlighted below:
We have labels on the side panel for the response choices the user could give (such as the button options) as well as for the types of responses the chatbot would give back (such as the types of information). We also took the comments from the formative user testing session into account, keeping user responses consistent by incorporating button answers. Below are some pictures of important conversational moments with our chatbot:
Our chatbot, Omni, is triggered when the user types in “Hi” or “Hello”.
There are three options the bot poses to the user to ask how long their friend has been struggling with depression. Each option provokes different responses, information, and advice from the bot. Below is an example of choosing the “Less than 2 weeks” option:
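The branching on the three duration options can be sketched as a simple dispatch; as a hedge, the option labels below match the bot’s buttons, but the advice strings are our own condensed summaries of Omni’s actual responses:

```python
# Illustrative sketch of how Omni branches on the three duration options.
# Labels match the bot's buttons; advice text is condensed, not verbatim.
DURATION_ADVICE = {
    "Less than 2 weeks": (
        "It may be too early to tell -- your friend could just be having "
        "a bad day. Here are some symptoms to watch for."
    ),
    "Under 6 months": (
        "Let's talk about how your friend is doing. If they ever feel "
        "suicidal, call the Suicide Prevention Center immediately."
    ),
    "Over 6 months": (
        "If their current coping methods aren't working, professional "
        "and medical help may be the next step."
    ),
}

def handle_duration(option: str) -> str:
    """Dispatch the user's button choice to the matching advice branch."""
    return DURATION_ADVICE.get(option, "Please choose one of the three options.")
```

Using buttons rather than free text here is what keeps the dispatch simple: the input is guaranteed to be one of three known strings, with a fallback only as a safety net.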
If someone is concerned for their friend who has been struggling with depression for less than 2 weeks, Omni does not quickly say that they need immediate help. Instead, it customizes the answer to advise that this is probably not a concern as it is too early to tell and the person could just be having a bad day. The bot provides helpful information about depression symptoms to look out for, points to other useful sources, and expands on what to do if a serious situation arises.
An example of choosing the “Under 6 months” option:
The user mentions that their friend is in a serious situation and is feeling suicidal. In this circumstance, Omni responds empathetically and immediately prompts the user to call the Suicide Prevention Center with the number it provides. It also helps that it is personable and easy to talk to in this difficult situation. Additionally, Omni proceeds to give further information about depression and therapists to recommend to their friend.
An example of choosing the “Over 6 months” option:
In addition to the features above, Omni provides specific professional and medical help when the user says the methods their friend is currently using to cope with depression are not working. Again, Omni gives the user supportive advice, tells them things to say during this tough time, and shares other useful information.
Feedback
We had our demo day for this project on 4/10, where we demonstrated our design to various other students in our HCI class. We got a lot of comments on where the design went well and where we could improve, using the “I Like, I Wish, What if” format. Students liked the concept and the nature of the problem, having a casual conversation instead of searching the web (which they thought was well implemented), the buttons provided for the options, the emotional and resourceful support, how informational the bot was, and the different types of advice depending on the option chosen. Some areas students thought we could work on were offering more button options, having the bot jump right into the topic instead of being more conversational, and expanding to more mental illnesses in addition to depression.
Looking Back
In the future, there are areas where we can refine our design for the chatbot. We could have more options that the user could choose to direct the conversation. We could also have free responses where the user can ask their own questions to learn more information about depression. Moreover, we can briefly introduce Omni and then jump right into giving the user information about depression and what their friend might be facing. We can also provide details, such as email and phone number, on how to contact mental health professionals instead of just the link to their website. Furthermore, we can expand our project in the future to not only handle depression but also other illnesses such as bipolar disorder.
One important thing to note is that a person who has depression could themselves be using our chatbot because they want help that is as anonymous as possible. They could be pretending to be the friend while inputting their own symptoms. This gives us a responsibility, as designers, to watch the wording of our sentences and exactly what we say, so as not to make them feel worse about themselves. Also, if we had more time, we would have built more into the flow, including a discussion of anxiety versus depression and how the two can be connected.