Bot as Research Tool
A Study of Conversational Symbiosis Amongst Humans and Artificial Agents in the Context of Romantic Relationships
This study was the start of my research into how conversational interfaces can support conversational symbiosis amongst humans and artificial agents in the context of romantic relationships. I hoped it would help me better understand how comfortable individuals are with such an interface, its possible affordances, its possibilities for feedback, and its possible integrations.
This study was an essential first step to drive forward future research and design. It included a user interview and walkthrough.
For the first research procedure, I took a participant through a 30-minute user interview/walkthrough, which unfolded in two stages:
- The participant was introduced to a scenario (their partner had made plans for them without asking beforehand) and asked to imagine themselves in that scenario while messaging back and forth over a provided messaging tool (apple) that I prototyped for this study. While messaging, the participant was asked to talk through the interaction (i.e., what is working? what is not working?).
- The participant was then asked follow-up questions about what was said during the messaging activity. During the activity, I used QuickTime to record the screen of the messaging tool so that I could analyze the interaction later. Interviews were conducted in a private room so that no passerby was inadvertently recorded. In addition, I took notes on paper during the interview.
For this study, I designed and built apple. apple lets users simulate conversations they could have, or have had, with their partner, and through those conversations build a greater understanding of their partner and relationship.
apple was designed to let an individual simulate a conversation about a topic that could become an argument between that individual and their partner.
apple functioned as an SMS bot built on Twilio. Each conversation involved four participants: the user; the apple bot, which introduces the tool and provides help; a simulated partner bot named Chris; and a mediator bot.
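Because all four roles share a single SMS thread, each outgoing message has to carry its speaker. A minimal sketch of what that role-tagged rendering might look like — the helper names here are mine, not the production code, which ran over Twilio's webhook API:

```python
# Hypothetical sketch: tag each message in the single SMS thread with its
# speaker, screenplay-style, so the user can tell the roles apart.

def tag(role, body):
    """Prefix a message with its speaker name, as in a screenplay."""
    return f"{role.upper()}: {body}"

def build_thread(turns):
    """Render a list of (role, message) turns as one SMS-style transcript."""
    return "\n".join(tag(role, body) for role, body in turns)

thread = build_thread([
    ("chris", "Hey! I made plans for us next Saturday."),
    ("taylor", "Wait, without asking me first?"),
    ("mediator", "Let's begin with the sharing phase."),
])
```

In practice every bot reply sent through Twilio would pass through something like `tag` before delivery, since SMS itself offers no per-sender styling within one thread.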
A user is first introduced to the purpose of the bot and the scenario they should imagine themselves in for the simulation. Once the simulation begins, the mediator suggests a basic framework for conversation (based on Pangaro's CLEAT model) and provides relationship advice drawn from research and the literature.
The bot uses Dialogflow to understand what users are saying, determine whether they are successfully working through the CLEAT model, and select the most relevant advice.
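Dialogflow handles the actual intent detection; conceptually, it maps a free-text message to a conversation-stage intent so the mediator can pick relevant advice. A toy keyword-based stand-in (not the real Dialogflow agent — the keywords below are illustrative, and the stage names come from the four phases described later) shows the shape of that mapping:

```python
# Toy stand-in for the intent matching Dialogflow performs: map a free-text
# message to a conversation-stage intent. Keywords are illustrative only.

STAGE_KEYWORDS = {
    "sharing": ["i feel", "i felt", "upset", "frustrated"],
    "exchange": ["why did you", "can you explain", "what were you thinking"],
    "evolution": ["next time", "going forward", "we could"],
    "response": ["i understand", "i hear you", "that makes sense"],
}

def detect_stage(message):
    """Return the first stage whose keywords appear in the message, else None."""
    text = message.lower()
    for stage, keywords in STAGE_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return stage
    return None
```

The production agent does this with trained intents rather than keyword lists, which is what lets it tolerate paraphrase and misspelling.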
How apple Works
In the scenario I created for apple, four different individuals are involved: apple, the bot itself; Taylor, played by the user; Chris, a bot that plays Taylor's partner; and the Mediator Bot, an objective, nonjudgmental, accepting, and thoughtful third party.
When users first message apple, the apple bot introduces itself, explains what it does, and describes how to trigger its options.
Users are told they can “simulate two types of conversations. Type 1 to have a conversation with a simulated partner. Type 2 to include a mediator bot” and that “to learn about what else you can do, type assistance.”
Once a user completes the initial setup, they are taken into the simulation and told, “You are now entering an alternative world. Your partner is just about to text you about the event next Saturday.”
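This setup flow amounts to a small command dispatch on the incoming SMS body. A sketch of that dispatch — the reply strings are abbreviated stand-ins, not the bot's exact copy:

```python
# Sketch of the initial command dispatch: an incoming SMS body of "1", "2",
# or "assistance" selects a mode; anything else re-prompts. Reply wording
# here is abbreviated, not the production bot's exact copy.

def handle_setup(body):
    command = body.strip().lower()
    if command == "1":
        return "Starting a conversation with your simulated partner..."
    if command == "2":
        return "Starting a conversation that includes the mediator bot..."
    if command == "assistance":
        return "Here is what else you can do: ..."
    return "Type 1, 2, or assistance to continue."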
Once in that world, the user talks to their partner. The user and Chris go back and forth for a short time before Chris (the partner) asks to include the mediator bot. The mediator bot introduces itself and the four stages of conversation mentioned earlier (the sharing, exchange, evolution, and response phases).
The mediator bot then facilitates a productive conversation between the two parties, drawing on strategies from the intimate-relationship literature to provide research-backed advice, which in turn builds the mediator bot's credibility.
By the end of the conversation, users are able to reflect on their own conversations, see how the strategies mentioned in the chat could be used, and recognize where they may have made mistakes with a partner in the past.
They can then redo the simulation or choose from a number of other simulations (these have not yet been developed).
What I Learned From apple
As described above, a conversation on apple gives users room to reflect on their own conversations, the strategies mentioned in the chat, and past mistakes with a partner. Still, there were a couple of issues I confronted when making the bot:
1. The timing of the texts. Ideally, I would replicate the timing of a real conversation, but I could not figure out a way to do that with Twilio.
2. Inability to differentiate individuals. Below, you can see a more optimal version in which users could easily scan and differentiate individuals' messages. For now, I used a technique inspired by screenwriting to differentiate roles.
3. Difficulty visualizing the stages of conversation. I struggled to visualize these stages given the tools, and the limited personalization they allowed me, in building the bot. Ideally, a user would be able to see exactly where they are in a conversation, what they need to do, and what they have already achieved.
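On the first issue, one direction that might be worth exploring (untested here, and not part of the original build): rather than replying to the webhook synchronously, schedule the outgoing message after a human-plausible, length-dependent delay. A sketch, with `send` standing in for a real transport such as Twilio's REST `messages.create` call:

```python
import threading

# Hypothetical sketch of delayed replies (not part of the original build):
# estimate a typing-speed delay from message length, then deliver the reply
# via a timer instead of answering the webhook immediately. `send` stands
# in for a real transport such as Twilio's REST messages.create call.

def typing_delay(body, chars_per_second=6.0):
    """Estimate a plausible typing delay, in seconds, from message length."""
    return len(body) / chars_per_second

def schedule_reply(body, delay_seconds, send):
    """Deliver `body` through `send` after `delay_seconds`."""
    timer = threading.Timer(delay_seconds, send, args=(body,))
    timer.start()
    return timer
```

Whether this plays well with Twilio's webhook model in practice is exactly the open question noted above; the sketch only shows that the scheduling side is straightforward.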
Regardless of those issues, apple served as a probe to answer a number of research questions pertaining to my thesis. It can also serve as an artifact that lets a user take advantage of the effectiveness of simulations: by picturing a successful action, one can enhance one's ability to make decisions in the future.
What I Learned From This Study
I learned a lot from both designing apple and running this study, and those lessons will inform future designs. The insights included:
- Artificial agents employed in an intimate context were more accepted than expected.
- Tools should employ a clear frame or frames so that users can establish realistic expectations of the tool.
- A tool’s level of intervention, mode of activation, and level of integration should be influenced by the frame or frames employed by that interface.
- A unique set of frames is necessary to address different forms and kinds of conversation.
- Users lack awareness of relationship frameworks, tips, and strategies.
- Successful interfaces rely on an awareness of the situation. The contextual awareness of an interface should be influenced by the frame or frames employed by that interface.
- An agent should take a neutral perspective when in the presence of both partners, and other perspectives depending on the context of the situation.
- Tools should not allow users to become over-reliant on them.
- An agent’s visibility should be dependent on the flow of a conversation.
- Data use should be visible to all users.
- The form of an interface should be influenced by the frame or frames employed by that interface.
- Artificial agents have the potential to provide a place for reflection, an outside perspective, a guide for conversation, a calming presence, and an instrument for detecting sentiment and specific pieces of language.