Case Study → Visitor Surveys

Generative research to understand customer behaviour

Matt Thomson
Shopify Hacks & UX Ponderings
12 min read · Dec 30, 2016

--

Update!

I’ve recently revisited this project and entirely redesigned the survey to give more useful (and more easily collated) information, and to be responsive.

This was built by modifying the standard Shopify contact form code to add extra fields, including Likert scales (https://en.wikipedia.org/wiki/Likert_scale).

This updated survey can be found at — www.mattt.com.au/survey

TL;DR A case study documenting the setup and analysis of an online visitor survey for my Shopify store using Wufoo and Pixelpop. The current version of the survey can be found here → www.mattt.com.au/survey

Online survey popup on website home page.

Summary

This case study documents a generative research project I undertook to better understand visitors to my website over a four month period in 2016.

  • Open-ended questions were generated to focus the research and identify key trends, actionable outcomes and areas for further enquiry.
  • Data collected from an online survey was combined with Shopify order data and summarised in Excel.
  • Basic statistical analysis was undertaken with SPSS Statistics to better understand relationships within the data and support observations.
  • 133 survey responses were collected over the four month period.

The Challenge

As a designer, maker and business owner, I’m in a unique position of having a personal, ongoing relationship with many of my customers, and have often relied upon informal conversation, support requests, direct feedback and observation of customers in my retail space to gather insights.

While feedback like this offers fascinating, if isolated, insights and is a valid way to better understand my customers, these are somewhat disconnected interactions that lack the quantitative data to give further context.

The purpose of the project therefore was to better understand overall customer attitudes and behaviour (and if this differed between customer segments) in the following three key areas:

  • Communication with customers
  • Website features, what customers liked and felt was missing
  • Impact of the incentive on conversion

Constraints

Like any project, this one had a number of inherent constraints that needed to be acknowledged.

Technical

Lacking the ability to code a bespoke popup/survey solution myself, I opted to use a combination of existing apps to implement the survey. Although this was cost effective, it did introduce several technical compromises.

These apps were:

Pixelpop App

A new (August 2016) app from Pixel Union designed specifically for displaying popups and banners on Shopify stores.

Admin area of Pixelpop app with live visitor data for the online survey.

The app’s features for designing popups are quite robust (if a little fiddly to style) and far superior to those of equivalent apps in the Shopify App Store.

The popup display options (when and where popups are shown to the visitor) are somewhat basic, and custom code was required so that the survey only displayed on the home page in a desktop browser.

Code added to display survey popup only on home page
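The original code isn’t reproduced here, but gating of this kind can be sketched as follows. The hand-off to Pixelpop is omitted (only a log statement stands in for it), and the 1024px desktop breakpoint is an assumption:

    // Sketch only: the actual Pixelpop hand-off is omitted, and the
    // 1024px desktop breakpoint is an assumption.
    function shouldShowSurveyPopup() {
      const isHomePage = window.location.pathname === '/';
      const isDesktop = window.matchMedia('(min-width: 1024px)').matches;
      return isHomePage && isDesktop;
    }

    document.addEventListener('DOMContentLoaded', function () {
      if (shouldShowSurveyPopup()) {
        // Trigger the popup here; in this sketch we just log.
        console.log('Show the survey popup');
      }
    });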

While this wasn’t a long term solution, website analytics showed that historically very few orders had been placed from mobile devices, limiting the impact on conversions (however, this did exclude mobile device use as a research parameter).

UPDATE: In the latest update of Pixelpop there is now an option to automatically display popups as a banner on mobile, in response to Google’s new policy on mobile interstitials/popups (view on the Pixel Union blog).

Wufoo

Having used Wufoo for several years on my website, I was very familiar with its features, its limitations and how best to integrate it into my Shopify store.

Wufoo’s very straightforward offering.

As simple conditional logic can easily be added to Wufoo forms, I was able to display targeted questions to specific self-identified customer segments.

A key technical limitation Wufoo shared with Pixelpop was the difficulty of styling the forms to function on mobile devices, in particular when conditional logic is used to toggle questions.

However as the survey popup was not being displayed on mobile devices (see above) the impact of this limitation was minimal.

Another significant limitation of Wufoo is the use of iframes to embed forms and subsequent handling of error messages for users. This makes it difficult for forms to be completed by users with screen readers or other similar assistive devices.

💡 Due to these limitations I use a modified version of the inbuilt Shopify form for critical website functionality such as contact forms etc.

Social Desirability Bias

Based upon previous informal conversations with customers in my retail space, I suspected Social Desirability Bias would be the main bias present.

This is due to my existing relationship with customers: they are generally hesitant to admit when they find the website difficult to use, and tend to blame themselves rather than the website for any deficiency.

To minimise the potential impact of this bias I used neutral language throughout the online survey along with clearly delineated opportunities for both positive feedback and constructive criticism.

A section of the online survey.

What I Did

As this was a generative research project, the process was by its very nature quite flexible and somewhat open-ended.

The strategies I used can be summarised as:

Online surveys

The surveys were hosted on a dedicated page (www.mattt.com.au/survey), which the Pixelpop popup linked to.

As website visitors opted in to completing the survey and were offered a substantial incentive ($25 off their next order), I suspected the surveys would be answered thoroughly.

This assumption was supported by the length and detail of the free text responses.

💡 To minimise decision making fatigue (which might impact their ability to select, customise and purchase a bag after completing the survey) I ensured the high-value qualitative questions were in the first half of the survey, and limited the number of quantitative demographic questions (such as age and gender).

Wufoo summary of excluded feedback survey responses from August

The initial response to the survey was encouraging, with in-depth responses, a spike in conversions and reports of several technical problems.

As substantial refinements were made to the survey as a result, I have excluded the survey responses from the 25th to the 31st of August 2016.

An added benefit was that all data now covered complete months, making analysis and comparison of this data more accurate.

Survey responses also created opportunities for me to contact the respondents to address specific concerns (such as not being able to find key product information), which resulted in several additional orders.

💡 The technical limitations of Wufoo could have been mitigated by simplifying the survey or by coding a custom JavaScript form with conditional logic.
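As a rough illustration of the latter, conditional logic in a hand-coded form is essentially a listener that toggles the visibility of question blocks. The field names and markup below are hypothetical, not taken from the actual survey:

    // Sketch of hand-coded conditional logic: reveal follow-up questions
    // only for the matching self-identified segment. Field names and
    // markup are hypothetical.
    const segmentField = document.querySelector('select[name="customer-segment"]');

    segmentField.addEventListener('change', function () {
      // Hide every segment-specific question block...
      document.querySelectorAll('.segment-question').forEach(function (block) {
        block.hidden = true;
      });

      // ...then reveal the block matching the chosen segment,
      // e.g. <div class="segment-question" data-segment="repeat">.
      const selected = document.querySelector(
        '.segment-question[data-segment="' + segmentField.value + '"]'
      );
      if (selected) {
        selected.hidden = false;
      }
    });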

Shopify Data

To give further context to the survey responses I combined these with associated Shopify order data.

Example of customer orders filtered by discount code SURVEYV4QSC

This allowed me to better understand the magnitude of the responses in relation to total website traffic and suggest relationships between the discount incentive and average order value etc.

💡 Assigning survey responses to customer details was a manual process of matching email addresses and adding customer tags. If this were automated, it would be far easier to track ongoing behaviour and generate further insights, particularly to better understand the specific behaviours of customer segments (new, repeat and fanatic).
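For illustration, automating that matching step could be as simple as joining the two exports on email address. The Node.js sketch below assumes simplified CSV exports (no quoted commas) with hypothetical layouts: surveys.csv as email,segment and orders.csv as order_number,email.

    // Sketch: match survey responses to Shopify orders by email address.
    // Assumes simplified CSVs (no quoted commas) with hypothetical layouts.
    const fs = require('fs');

    function dataRows(path) {
      // Read the file and drop the header row.
      return fs.readFileSync(path, 'utf8').trim().split('\n').slice(1);
    }

    // Build a lookup of segment by normalised email address.
    const segmentByEmail = new Map();
    for (const line of dataRows('surveys.csv')) {
      const [email, segment] = line.split(',');
      segmentByEmail.set(email.trim().toLowerCase(), segment.trim());
    }

    // Suggest a customer tag for every order with a matching response.
    for (const line of dataRows('orders.csv')) {
      const [orderNumber, email] = line.split(',');
      const segment = segmentByEmail.get(email.trim().toLowerCase());
      if (segment) {
        console.log(orderNumber + ': tag customer as "' + segment + '"');
      }
    }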

Digital affinity mapping

Digital affinity mapping was used to group and recode the free text responses within the combined quantitative and qualitative data.

As the survey data was already available as a CSV file, it was collated and combined with the other data sources in Excel and, where required, exported to SPSS Statistics for basic statistical analysis.
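As a sketch of the recoding step, free text answers can be mapped to affinity groups with simple keyword rules. The groups and keywords below are hypothetical examples, not the actual codes used:

    // Sketch of recoding free text responses into affinity groups.
    // The groups and keywords are hypothetical examples only.
    const affinityRules = [
      { group: 'More photos', keywords: ['photo', 'picture', 'image'] },
      { group: 'Sizing info', keywords: ['size', 'dimension', 'fit'] },
      { group: 'Shipping detail', keywords: ['shipping', 'delivery', 'post'] },
    ];

    function recode(response) {
      const text = response.toLowerCase();
      const match = affinityRules.find(rule =>
        rule.keywords.some(keyword => text.includes(keyword))
      );
      return match ? match.group : 'Uncoded'; // left for manual review
    }

    console.log(recode('More pictures of the bag interior please!')); // More photos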

SPSS Statistical Analysis

Statistical analysis was completed using SPSS Statistics to calculate adjusted residual values (ARVs) to better understand behaviour relating to the monetary incentive.
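SPSS reports adjusted residuals as part of its crosstabs output; for anyone curious about the underlying calculation, here is a minimal sketch of the standard formula (the observed count minus the expected count, divided by its standard error). The example counts are hypothetical:

    // Sketch of the standard adjusted residual calculation for a
    // contingency table: (observed - expected) / standard error.
    function adjustedResiduals(table) {
      const rowTotals = table.map(row => row.reduce((a, b) => a + b, 0));
      const colTotals = table[0].map((_, j) =>
        table.reduce((sum, row) => sum + row[j], 0)
      );
      const n = rowTotals.reduce((a, b) => a + b, 0);

      return table.map((row, i) =>
        row.map((observed, j) => {
          const expected = (rowTotals[i] * colTotals[j]) / n;
          const se = Math.sqrt(
            expected * (1 - rowTotals[i] / n) * (1 - colTotals[j] / n)
          );
          return (observed - expected) / se;
        })
      );
    }

    // Hypothetical counts: rows = customer segments, columns = redeemed yes/no.
    console.log(adjustedResiduals([
      [8, 12],
      [5, 15],
    ]));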

Due to the low frequency of most other qualitative responses it was difficult to perform statistical analysis without grouping these variables to create a more meaningful data set.

As this would then reduce specificity in response types I decided it was best to revisit this when I had collected a larger dataset (more survey responses).

Example of survey results imported into SPSS for analysis showing Customer Segments vs Favourite Features.

Results & Findings

A key assumption (confirmed by the research data) was that behaviours and attitudes would differ distinctly between the self-identified customer segments.

These responses have therefore been considered both individually and collectively.

💡 Segmenting this initial four month dataset over time (behaviour per month etc) resulted in very low frequency counts for the majority of responses. With a larger dataset it would be possible to undertake this analysis and understand customer segment behaviour changes over time.

Communication

As a small business owner, ensuring you are communicating effectively with your customers is crucially important.

Q → How would you prefer to ask a question?

  • Email is the preferred contact method for asking a question across all customer segments, particularly the ‘repeat’ and ‘fanatic’ segments.
  • Live Chat was a significant channel for the ‘new’ customer segment.
  • This suggests that my policy of predominantly answering customer questions via email is appropriate, though Live Chat and Phone should continue to be promoted as options, particularly to the ‘new’ customer segment, who may be uncertain about my response time and prefer immediate communication.

Website Features

This was perhaps one of the most complicated sets of results, being collated from two free text questions:

Q → What information do you need/want to help choose the perfect bag?
Q → What is missing from the website?

As multiple responses were possible from each survey (235 responses in total), and ‘new’ customers weren’t asked about missing features (as they were yet to use the website), I’ve only drawn broad observations from these results.

  • The ‘repeat’ and ‘fanatic’ segments were the happiest, with both the wanted (at least 36%) and missing (59%) feature responses suggesting the website did what they required of it.
  • When the three photo categories were combined they became the second most reported wanted and missing website feature across all customer segments, and therefore possibly the highest priority feature to add to the website next.

Impact of incentive on conversion

Predicting customer behaviour around discounts and incentives can be difficult when relying upon a single quantitative response (such as whether the code was redeemed as shown below).

For this reason I decided to generate a number of simple metrics from both the online survey and Shopify data to see if this might provide more nuanced insights.

Discount code redemption was the first metric analysed, simply collected from Shopify orders made with the discount code.

As customers redeemed the discount code with both online and retail orders (through Shopify POS) this also made it possible to understand the interaction of location and customer segment.

From the data above, while there does seem to be a larger proportion of the ‘fanatic’ customer segment redeeming the incentive through the POS, this was only three orders (from 20 responses), so I would be hesitant to draw conclusions.

Average order value was the next metric calculated, comparing the average value of orders placed with the discount code to those placed without the incentive.

💡 As there was only a marginal difference within this metric for “location” this wasn’t included in the results below.

Over the four month period there were the following redeemed orders (average order value includes the discount):

By comparison, the 103 orders placed without redeeming a discount code had an average value of $188.

From this simple comparison it can be observed that:

  • Average redeemed order value (approximately $225) was $37 greater than the $188 non-redeemed average

Time between survey and redemption was the final metric I used to understand the impact of the incentive on conversion.

This combined the online survey and Shopify data to establish whether orders were placed on the same day as the survey was submitted.

Rather than using an average value for this metric, I found it more useful to total the number of same-day redemptions to understand this behaviour.
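Mechanically, this just means comparing calendar dates across the two datasets once records are matched by email. A minimal sketch, with hypothetical record shapes:

    // Sketch of the same-day check: count survey responses where a
    // matching order (by email) was placed on the same calendar date.
    // Record shapes are hypothetical.
    function sameDayCount(surveys, orders) {
      return surveys.filter(survey =>
        orders.some(order =>
          order.email === survey.email &&
          order.date.slice(0, 10) === survey.date.slice(0, 10) // YYYY-MM-DD
        )
      ).length;
    }

    const surveys = [{ email: 'a@example.com', date: '2016-09-02T10:15:00' }];
    const orders = [{ email: 'a@example.com', date: '2016-09-02T14:40:00' }];
    console.log(sameDayCount(surveys, orders)); // 1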

From these results it can be observed that there were too few same-day orders to infer much about customer behaviour, and that this should be revisited with a larger dataset.

The few observations that did seem reasonable were:

  • Online orders: 43% of the ‘repeat’ customer segment placed a same-day order, suggesting this segment is likely to visit the website more than once before placing an order (more data might show this to also be the case with the ‘fanatic’ segment).
  • POS orders: It was somewhat surprising that 30% of POS orders were placed on the same day as the survey (which wasn’t available to complete in-store). This suggests it is not uncommon for customers to check the desktop version of the website directly before visiting the retail location (as the survey didn’t display on mobile).

Due to the low frequencies across the three metrics it is difficult to infer detailed customer behaviour in regards to incentive redemption; however, some very general findings are:

  • Approximately a third of customers redeemed the incentive;
  • These customers spent an average of $37 more per order, roughly 50% more than the $25 incentive value;
  • To understand the true impact this had on the profit margin of these orders analysis of COGS and inputs such as shipping would need to be considered; and
  • Overall, 50% of orders (21 of 42) were placed on the same day as the survey was completed.

Learnings & Next Steps

The online survey responses collected over this four month period, together with the associated Shopify data, generated actionable insights, identified several areas for future enquiry and supported a number of key assumptions.

Limitations of dataset

As stated throughout the case study many of these observations and attempts to predict behaviour across customer segments and locations require a larger dataset to establish statistically meaningful validity and reliability for the results.

This limited dataset also made it difficult to predict changes in month-to-month customer segment behaviour.

Linking data to customers

A key insight from the project was the value of matching online survey responses to the Shopify order data.

While this was a manual process, by tagging orders and customers as orders were processed, customer segments/groups could then be automatically filtered within the Shopify admin and exported, automating future analysis of this data.

This would be all the more powerful if it were possible to automatically link these survey responses with associated order data within the Shopify admin.

💡 As the majority of customers purchased bags from design markets or retail locations prior to 2014 (when Shopify POS was launched), this approach can’t accurately analyse my historical customer orders from these locations.

Incentive

Whilst not previously offered to visitors of my website, the incentive seemed to be a powerful driver of both survey completion and increased average order value.

Further experimentation with the format of the incentive (either as a percentage of the order value or a small “free” physical product) might be worthwhile to maximise its value to the customer.

Although technically difficult, offering a targeted incentive to each of the customer segments could also optimise the impact of the incentive further.

Survey refinement

Perhaps the biggest learning from this project was the tremendous value of having a well considered, thoughtfully designed and well executed online survey.

Refinements I will be implementing in future versions of the survey include:

  • more robust technology to manage the survey, potentially custom coded to display and perform equally well on all versions of the website
  • directly aligning the questions with the required behaviour being investigated
  • A-B testing a physical product as an incentive vs a monetary discount code
  • an automated email reminder to customers that haven’t redeemed the incentive within an allocated time frame
  • a complementary post-delivery survey to obtain feedback and further insights from customers after they had received their order

Final thoughts

The process of setting up a well executed online survey, while quite involved, rewards the effort with a fantastic amount of insight into customer attitudes and behaviour.

It is also a rich source of data to inform future product roadmaps and can help suggest and measure practical improvements to a business.
