Research in the age of COVID-19

How we incentivized users to participate in research, remotely

Amber Britton
Movable Ink Brand & Design
6 min read · Oct 1, 2020


Background

Research in a B2B organization is always a bit of a challenge. A traditional B2C company typically has hundreds of thousands of users to target for research at any one time. In B2B, our customers are other companies; for us, they are marketing teams at other companies, teams that already have busy, meeting-filled days.

When COVID-19 hit the world and the major cities of the U.S. shut down, our team didn’t stop working. We did, however, pause our research outreach out of respect for our clients, who were already dealing with so much, both personally and professionally.

As we were planning work for the upcoming quarter, we knew we wanted insight from our clients on additional problems to solve. We have an extensive database of all the research we’ve ever done, and we have great tools at our disposal to see how our clients are using our product. But we still didn’t quite have the answers we needed to move forward on a particular project related to surfacing additional performance insights. So we looked at the tools we did have at our disposal and brainstormed how we could gather some data quickly and easily.

One of the most powerful tools we use at Movable Ink is our analytics event tracking software. While we use this tool extensively for tracking usage metrics across the different features of our product, it also supports in-platform messaging, that is, displaying messages to our users directly within our own web application. This was something we had never really utilized.

Designing the Survey

We consulted a number of resources on survey design and collaborated closely with our incredible Senior Design Researcher, Hannah Graffeo, to finalize our list of questions.

We started by aligning on our goals for running the survey, asking ourselves these key questions:

  • What do we want to learn?
  • How do we plan to act on this data?
  • Who is the target audience?
  • How can we segment the target audience?
  • What kind of questions should we include?
  • What tools will we use?
  • What are our recruitment options?
  • How long do we want to run the survey?

We knew we wanted to keep the survey short so we could learn as much as possible from our clients while still respecting their time.

Based on our goals, we settled on six required multiple-choice questions that would help us segment our audience and four optional short-answer questions that would allow the respondent to expand on any of their answers.

We also included an option for respondents to let us know if they would be interested in participating in future research. It’s always amazing when clients want to participate in creating products, so we kept our fingers crossed that more clients would want to talk to us.


Building the Survey

Now that we knew how we wanted the survey designed, we had to decide where to place it. We took a look at our most visited pages related to analytics and settled on two high-traffic areas: overall analytics and self-service reports. Those became our target pages!

We also knew we wanted to hear from our clients directly, as opposed to internal employees. We were able to target only the pages in question and only non-employees, which meant the message would show only to our client users.
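Conceptually, the targeting came down to two checks: is the viewer on one of the target pages, and are they a client rather than an employee? Here is a minimal sketch of that logic in Python; the page paths and user fields are illustrative assumptions, not our actual configuration.

```python
# Hypothetical sketch of the targeting logic; the page paths and user
# fields are illustrative stand-ins, not our real configuration.
TARGET_PAGES = {"/analytics/overview", "/reports/self-service"}


def should_show_survey_prompt(user: dict, current_page: str) -> bool:
    """Show the survey prompt only to client (non-employee) users
    viewing one of the targeted analytics pages."""
    if user.get("is_employee", False):
        return False
    return current_page in TARGET_PAGES


# Example: a client user on the analytics overview page sees the prompt.
print(should_show_survey_prompt({"is_employee": False}, "/analytics/overview"))  # True
print(should_show_survey_prompt({"is_employee": True}, "/analytics/overview"))   # False
```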

At first, we tried a simple message design. Any client user that visited one of our two main analytics pages in our software would get the following message.

First iteration of survey
Our first iteration

Once they dismissed it or clicked through, they wouldn’t see it again. The prompt asked users to take a few minutes (five, specifically) to fill out a survey.

Deploying the Survey

We were so excited to try this and see if we got any responses. Every day we eagerly refreshed the dashboard to see how many users had been served the message. A few, then 30, then 100! The numbers kept going up. But there was one problem: not one person clicked it.

We thought something might be wrong. We used our session monitoring tool to watch a few sessions where the message popped up. Sure enough, the users saw the message, closed it, and continued with what they were doing. We were devastated.


Iterating on the Survey

We took a minute to have a pity party and then went back to work. We realized that our analytics tool also had the ability to A/B test the message and hold out a control. So we tried again. Our theory was that maybe five minutes sounded too long, and, in all honesty, it might not actually take five minutes. So we tested three different message types: the control was our previous message, plus two new versions.

Second iteration of the survey
From left to right: Control, Variation A, Variation B
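Our analytics tool handled the split for us, but the core idea is simple: each user is deterministically bucketed into the control or one of the two variations, so the assignment stays stable across sessions. A rough sketch of that bucketing, with the variant names as stand-ins:

```python
import hashlib

# Variant names are stand-ins for illustration.
VARIANTS = ["control", "variation_a", "variation_b"]


def assign_variant(user_id: str) -> str:
    """Deterministically bucket a user into one message variant.

    Hashing the user ID (instead of choosing randomly on every page view)
    keeps each user in the same bucket across sessions."""
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]


# The same user always lands in the same bucket.
print(assign_variant("client-123"))
print(assign_variant("client-123"))  # same variant as above
```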

Again, we watched and waited… and refreshed. Again, no one clicked. UGH!

This time we did not get discouraged. We were determined to get someone to engage! We circled the product team wagons for ideas and decided to try incentivizing users. We already offer incentives for larger research projects that take more of a user’s time, so we scaled that down to a smaller incentive: $5 for a few minutes of time. We also continued with our A/B testing approach to see if one set of language resonated more than another. We went live with three new versions: one that called out the incentive in the copy and two that called it out in the headline.

From left to right: Control, Variation A, Variation B

Within 24 hours, we had our first response!


Results

We added the message to two of the pages within our platform. In total, the message was sent 335 times. The survey was opened 15 times. We received 11 responses:

Results: the message was sent 335 times. The survey was opened 15 times. We received 11 responses
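Worked out as a funnel, that is roughly a 4.5% open rate on impressions, a 73% completion rate among opens, and a 3.3% overall response rate:

```python
impressions, opens, responses = 335, 15, 11

open_rate = opens / impressions          # ~4.5% of impressions opened the survey
completion_rate = responses / opens      # ~73% of opens turned into responses
response_rate = responses / impressions  # ~3.3% of impressions became responses

print(f"{open_rate:.1%} opened, {completion_rate:.1%} completed, {response_rate:.1%} responded")
```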

Almost every respondent opted into an additional research call with us. We were able to reach out to every participant and talk to them about their reporting needs and pain points in much more depth.

What Did We Learn?

We learned a bit more about what clients do with the analytics from our platform after they get them. We also documented other tools they use for analytics, along with more insights about their workflows in general. Finally, we discovered a few areas of our platform where actions were not intuitive, which translated into quick wins for our team to iterate on.

This was a great way to reach out to clients and meet them where they are in the platform to ask for feedback while respecting their time. We were able to get a quick pulse check on our clients' needs and increase our pool of research participants at the same time.

We learned that A/B testing is crucial to ensuring your message is resonating with users and getting the appropriate level of engagement. Incentives don’t hurt either.

We will definitely do this again in the future but might do a few things differently.

