Digital Research Prototype Fail

Shane Strassberg
Published in Shane-IxD-Thesis
3 min read · Nov 16, 2016

After designing, testing, and iterating on a low-fidelity paper prototype built to probe an assumption of unconscious implicit bias against women, I wanted to create a digital version for two reasons: to test how it might be received without my presence as a person takes it, and to collect a larger amount of quantitative and qualitative data.

I chose to build it with Typeform, an online survey maker that makes it easy to create simple or complex surveys. I used the same cards from my paper prototype and applied a randomization filter so that each participant would see a different line-up. After participants made their choices in the three categories — astronauts, journalists, and computer programmers — I asked them to explain why they chose as they did. When I ran this in person, there was an element of control: I could ask follow-up questions and have more of a discussion. With the digital version, I had to accept whatever answers participants gave, which proved very frustrating.
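The randomized line-up behavior described above can be sketched in code. This is a minimal illustration of the idea, not Typeform's actual implementation; the card names and line-up size are hypothetical placeholders.

```python
import random

# Hypothetical card pools per category; the real prototype used photo cards.
CARDS = {
    "astronaut": [f"astronaut_card_{i}" for i in range(1, 9)],
    "journalist": [f"journalist_card_{i}" for i in range(1, 9)],
    "computer_programmer": [f"programmer_card_{i}" for i in range(1, 9)],
}

def random_lineup(category, size=5):
    """Draw a random, duplicate-free line-up of cards for one category,
    mimicking the effect of Typeform's randomization filter."""
    return random.sample(CARDS[category], size)

def build_survey():
    """One participant's survey: a fresh random line-up per category."""
    return {category: random_lineup(category) for category in CARDS}
```

Each call to `build_survey` yields a different line-up, so no two participants are guaranteed to see the same cards in the same order.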

I also did not anticipate how different the experience of taking this survey would be between the web and mobile versions. Below you can see how the views differ and how that might affect the experience of taking it.

Typeform Web Version
Typeform Mobile Version

Typeform has a great metrics page that breaks down completion rates, time to complete, and devices used. It helped explain why I feel this digital prototype was a failure.

Typeform Metrics

The first thing that stands out is the low completion rate; the second is that participants primarily took the survey on a smartphone. When I then viewed the survey on my own smartphone, I realized this was not the optimal way to take it. The mobile view limits you to seeing one card at a time, which I believe reduces a person's ability to make proper comparisons. With the paper prototype, participants had all the cards laid out before them and could make multiple comparisons, which I think allowed them to choose more closely to their personal ideals.

I also believe this to be true based on some of the answers given about why people chose the cards they picked. Below are the answers collected on Typeform:

As you can see, a majority of participants chose randomly and felt hindered in their ability to choose by how the survey was presented to them. It is also clear that without my physical presence, many people did not take the survey seriously.

So, where to go from here?

I think I need to create a digital version that lets a participant view all the cards at once and move them around as they would have on an actual table. I would also build in a timer, since I timed people with the low-fidelity version. I may still be able to be involved when someone takes the test online by using Google Hangouts, so I can ask follow-up questions.
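The redesign described above — all cards visible, freely movable, with a built-in timer — could be modeled roughly like this. This is a sketch of an assumed design, not something I have built; the class and method names are hypothetical.

```python
import time

class CardSortSession:
    """Sketch of the proposed digital test: every card is visible at once,
    participants can reorder cards as on a physical table, and the whole
    session is timed like the low-fidelity paper version was."""

    def __init__(self, cards):
        self.cards = list(cards)   # current left-to-right order of all cards
        self.started = time.monotonic()
        self.moves = []            # log of (card, new_position) for analysis

    def move(self, card, new_position):
        """Reposition one card, as a participant would on a real table."""
        self.cards.remove(card)
        self.cards.insert(new_position, card)
        self.moves.append((card, new_position))

    def finish(self):
        """Return elapsed seconds, mirroring the paper-prototype timing."""
        return time.monotonic() - self.started
```

Logging every move, not just the final order, would also capture the comparison behavior that the one-card-at-a-time mobile view made impossible.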

What I've learned so far is that a digital platform will not always make things easier, and it presents challenges that might not be anticipated. To save time in the future, it would be better to first research how to design a digital solution for a test like this one.


Marine Corps Vet + Anthro Grad + Interaction Design Student + Small Forward