Design for Tension

陈典
7 min read · Nov 29, 2018


Team members: Dian Chen, James Petullo, Myles Spencer, Bezawit Ayalew

A chatbot is a computer program that simulates human conversation, or chat, through artificial intelligence. The chatbots we encounter most often in everyday life are conversational interfaces such as Siri or the Q&A assistants in online stores. They serve a single purpose, answering customers' questions, and if you ask them something outside what they are designed for, they cannot understand what you are saying. For this Design for Tension project, we set out to build a chatbot that tries to persuade people about a controversial topic.

Our main topic is Engineering & Humanity. First, we defined the motivation of our design: to convince students that humanities courses matter, both for their studies in whatever major they choose and for discovering their personal values, even if they are studying engineering. Based on this motivation, our user group is students who are unsure about their major or about why they need to take humanities courses as a college requirement, especially engineering students. What they expect from our chatbot is to learn the benefits of the humanities and to find the courses that best fit their personal skills and values. By the end of the conversation, we want every user to have found humanities courses that suit them and to be interested in taking them.

Our team worked really well together: we chose our main topic and designed the conversation flow as a group. After brainstorming, James built the framework of the chatbot while the rest of us prepared the dataset to support it. Once James finished the chatbot, all of us tested it before Demo day and made some changes. I believe our team worked efficiently.

Brainstorm & Context

During brainstorming, we started designing how the conversation would flow. At the beginning, we ask whether the user already has a major, which leads to two branches. If the user says yes, we ask whether that major is in the humanities. If it is, they are already interested in the humanities, so the conversation moves to its ending. If their major is not in the humanities, we ask how interested they are in taking a humanities course on a scale from 1 to 10, which splits into three branches. A 10 means the user is completely willing to take a humanities course, which is exactly our goal, so the conversation ends. A 6–9 means the user is unsure, perhaps because they don't know what to take or what the benefits are, so we show them the humanities courses available at their college along with the benefits of taking them. A 1–5 means the user is not willing to take humanities courses, so we ask about their values and show them how they could develop and express those values through the humanities; if they want, the chatbot can also search for humanities courses at their university. Going back to the beginning, if the user does not have a major yet, we ask about their skills in order to match them with an ideal major, and once they have that match we repeat the steps above about the humanities.
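To make the branching easier to follow, here is a minimal sketch of that flow in Python. This is not our actual implementation (James's code structures the dialogue differently and draws its replies from our dataset); the prompts and placeholder messages here are only meant to illustrate the two opening branches and the three interest levels.

```python
def run_conversation(ask, say):
    """Minimal sketch of the branching; `ask` reads the user's reply, `say` sends a message."""
    has_major = ask("Do you already have a major? (yes/no)") == "yes"

    if not has_major:
        skills = ask("What are some of your skills?")
        say(f"Based on '{skills}', here is a major that might suit you...")  # placeholder matching step
    elif ask("Is your major in the humanities? (yes/no)") == "yes":
        say("Great, you are already studying the humanities.")
        say("Goodbye!")
        return

    interest = int(ask("On a scale of 1-10, how interested are you in taking a humanities course?"))

    if interest == 10:
        say("Wonderful, that is exactly what we hoped for.")
    elif interest >= 6:
        # Unsure users: show available courses plus the benefits of taking them.
        school = ask("Which school do you attend?")
        say(f"Here are some humanities courses at {school} and what you can gain from them...")
    else:
        # Reluctant users: connect their personal values to the humanities.
        values = ask("What are some things that you value?")
        say(f"Here is how humanities courses can help you develop and express '{values}'...")
        if ask("Want me to search humanities courses at your school? (yes/no)") == "yes":
            school = ask("Which school do you attend?")
            say(f"Searching humanities courses at {school}...")

    say("Goodbye!")
```

Running it as `run_conversation(input, print)` walks through the same decisions shown in the flow chart below.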

For the context, because my teammate preferred to use Python, we were able to build a different context and interface from Messenger or Slack. We created an HTML page for our chatbot, with orange as its main color to give it a positive, energetic feel. We also set the conversation bubbles of our chatbot, named 'Circe', to orange and the users' bubbles to grey, to create contrast within the conversation flow.
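We don't go into the exact web stack in this post, but as a rough illustration, the sketch below shows a Python-served chat page with the same color scheme. Flask is only an assumption for the example, not necessarily what our prototype runs on; only the orange bot bubbles and grey user bubbles come from our actual design.

```python
from flask import Flask  # assumption: any small Python web framework would do

app = Flask(__name__)

# Static snapshot of the chat page; in the real prototype, bubbles are appended as the conversation flows.
PAGE = """
<!doctype html>
<style>
  .bot  { background: #f28c28; color: #fff; border-radius: 12px; padding: 8px; margin: 4px; }  /* Circe: orange */
  .user { background: #d3d3d3; color: #000; border-radius: 12px; padding: 8px; margin: 4px; }  /* user: grey  */
</style>
<div class="bot">Hi, I'm Circe! Do you already have a major?</div>
<div class="user">Not yet.</div>
"""

@app.route("/")
def chat_page():
    return PAGE

if __name__ == "__main__":
    app.run()
```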

flow chart for how the conversation flows

Final prototype

After we finished the prototype, we tested it within our own team, which gave us a new perspective on the chatbot. From my point of view, in our first version the conversation simply stopped after searching for humanities courses, ending abruptly. While I was testing it, I did not even realize the conversation was over, so we changed the order in which the responses are displayed and added a 'Goodbye' at the end.

main process of our chatbot

On Demo day, six users tested our chatbot and gave us really useful feedback and advice. Most users were curious about the question that asks for their school in order to find humanities courses, wondering whether we cover every university in the US. The truth is that we do, but they could not go back to that question to give a different answer, so we need to let users return to any step they are interested in. Also, after we showed them the courses offered at their specific college, users tried to check the details of those courses; however, we only provided the results as plain text instead of actual links. The most interesting piece of feedback was that users spent a really long time on the question "What are some things that you value?", which they felt was too ambiguous and broad, leaving them confused about what to answer.

Overall, most of the feedback was positive: users felt our chatbot was genuinely useful for deciding on a major and choosing humanities courses. Because our chatbot is designed to be personal, each user can find results specific to what they want. We present the benefits of studying the humanities not by putting everything in front of users at once, but in a more targeted way, offering different benefits to different users based on their own skills and values. I think this is the key reason our design is successful and persuasive for most users: the chatbot's actions and responses come from analyzing each user's situation and giving them reasonable feedback.

users are testing our prototype

Summary of testing

Feedback:

Advice:

1. Add a question asking whether the user wants to continue the conversation, and enable them to go back to any point in the chat

2. Provide links along with the course search results

3. The "values" question is too ambiguous and broad

4. The typing area is not clear enough

5. Add emoji to make the conversation more attractive

Positive reflection:

1. Helpful for users who are unsure how to choose humanities courses

2. Persuasive even for users who are not at all interested in the humanities, because they learn which humanities courses they could take based on their values

3. The context is clear, and it's great that the chatbot's and the users' conversation bubbles have different colors.

Improvement:

1. Add links to the result information (see the sketch after this list)

2. Change the order of the ending sentences to make them clearer (already done)

3. Enable users to continue or restart the conversation at the end (already done)

4. Change the wording or the way questions are asked

5. Add emoji throughout the conversation

Conclusion

For this design for tension, I believe we were quite successful in conveying information to users in a personal and specific way. Testing the prototype gave us plenty of advice that lets us keep improving our design. If I had another chance to redo this project, there are a few things I would change that we couldn't achieve this time. Because we hand-typed our dataset of values and skills in JSON, we didn't have enough time to build a larger and more powerful one; next time we will find a more efficient way to do this. We would also design a prettier, more attractive interface and put emoji into the conversation. Overall, I believe our chatbot works really well: almost every user had a great experience chatting with it and received useful advice about their humanities courses.
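To give an idea of what that hand-typed dataset looks like, here is a hypothetical fragment in the same spirit; the actual values, benefits, and course names in our JSON file are different and more numerous.

```python
import json

# Hypothetical fragment of a values-to-courses mapping; our real JSON file is larger and hand-typed.
DATASET = json.loads("""
{
  "values": {
    "creativity": {"benefit": "Humanities courses let you express ideas in your own voice.",
                   "courses": ["Creative Writing", "Art History"]},
    "empathy":    {"benefit": "Studying literature helps you understand other perspectives.",
                   "courses": ["World Literature", "Ethics"]}
  }
}
""")

def recommend(value):
    """Look up the benefit and suggested courses for a value the user mentions."""
    entry = DATASET["values"].get(value.lower())
    if entry is None:
        return "Tell me more about what you value."
    return f"{entry['benefit']} You might enjoy: {', '.join(entry['courses'])}."

print(recommend("creativity"))
```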

This is a video of our chatbot showing the important conversational aspects:

https://www.youtube.com/watch?v=YINs1woaNVo
