Custom Web Development in a Crunch
Our annual end-user conference (this year, part of VMworld 2017) brings together thousands of enterprise mobility enthusiasts, partners and customers. It’s the perfect opportunity to not only collect valuable insights, but also to observe firsthand how users interact with our products. To motivate attendees to provide feedback, we offered custom avatars at Connect 2016.
In my previous post, “The Avatar Experience,” I explained how our user experience (UX) team conceived and developed a custom avatar production process. However, before going down this path, our team needed to determine what kind of data to gather and the best format for doing so.
We faced several limitations, including booth space and face time with attendees. We wanted in-depth feedback, but could only collect it during limited session breaks or in a format participants could complete on their own time. We also needed to define use cases and the types of users who should participate. We weighed many factors as we set out to find the optimal solution.
In this post, I’ll discuss our UX team’s collaborative effort to develop, test and roll out a dynamic, interactive feedback website at the conference. We began with a simple brainstorming session that revealed two distinct approaches for collecting feedback from IT admins:
- Provide survey questions on printed cards, categorized by persona type so participants could choose which categories they identified with most. Attendees could turn in completed cards for a custom avatar.
- Provide a digital survey on tablet devices through an online link.
Using physical cards, we could interact directly with different types of IT admins, but we decided this low-tech approach would be too cumbersome both at the event and when it came time to analyze findings. So, we settled on a hybrid approach, tying in our avatar incentive.
Everyone loved the idea, but even on a small scale, creating a new website within a compressed timeline would be no easy feat, on top of our team’s already heavy workload. We had weeks to pull it together, instead of months, so we rallied to make our idea a reality.
As with any design process, we began gathering requirements and answering a series of critical questions, including:
- What does the site need to accomplish and how?
- What kind of data do we want?
- What questions do we need to ask participants and how should we categorize these questions?
- How should we engage and incentivize participants?
- How do we get this data into a consumable format?
Because our timeline was tight, we answered some of these questions as we built the site. We primarily focused on our users and the process they would need to complete in order to receive their avatars.
Ensuring that they understood exactly what we were asking them to do every step of the way was paramount. To ease them into the workflow, we needed a page that would introduce them to the concept and what to expect. The more transparent the experience, the easier it would be for participants to complete.
[Related: Inside the Mind of a UX Designer]
We also focused on the type and breadth of data that we wanted to collect. We defined specific demographic data we needed to obtain, as well as detailed information about our participants’ profiles, such as their industry, device types and counts.
We started with the basic functions that the website needed to provide participants, including:
- A quick overview of the task at hand.
- A mechanism to gather basic data about each user.
- A simple way to capture and upload their pictures to create custom-made avatars.
- A simple way to display and answer survey questions.
Our Senior UX Architect, Drew Malcom, developed a responsive framework so participants could easily access the site on both laptops and mobile devices. Our Design Strategist, Kyle Barry, defined how to port the data collected into a consumable format that we could analyze down the road. Then, we created a complete infrastructure, from hosting the site on our servers to engaging users on the site once their feedback was submitted.
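The post doesn’t describe the actual export format, but porting survey responses into something analyzable typically means flattening them into rows. Here is a hypothetical sketch of that step; the field names (`industry`, `deviceCount`, the question ids) are assumptions, not the real schema:

```typescript
// Hypothetical sketch: flatten survey responses into CSV for later analysis.
// All field names are assumptions, not the actual data model from the post.

interface SurveyResponse {
  name: string;
  industry: string;
  deviceCount: number;
  answers: Record<string, string>; // question id -> answer text
}

// Escape a CSV field: quote it if it contains a comma, quote, or newline.
function csvField(value: string): string {
  return /[",\n]/.test(value) ? `"${value.replace(/"/g, '""')}"` : value;
}

// Build a CSV string with one row per participant. Question columns come
// from the union of all question ids, so rows stay aligned even when
// participants answered different question sets.
function toCsv(responses: SurveyResponse[]): string {
  const questionIds = Array.from(
    new Set(responses.flatMap((r) => Object.keys(r.answers)))
  ).sort();
  const header = ["name", "industry", "deviceCount", ...questionIds];
  const rows = responses.map((r) => [
    r.name,
    r.industry,
    String(r.deviceCount),
    ...questionIds.map((q) => r.answers[q] ?? ""),
  ]);
  return [header, ...rows].map((row) => row.map(csvField).join(",")).join("\n");
}
```

A flat, union-of-columns layout like this keeps the output loadable in a spreadsheet even when question lists change between iterations.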
We also developed a system so our visual design team could track custom-made avatars, including who created them and when they were sent. From here, our team defined the user flow and how all the elements worked in concert from screen to screen.
While the exact list of questions we wanted to ask customers had not yet been finalized, we knew there would be a significant number. A card user interface (UI) pattern was a handy format to iterate on because it allowed for multiple functions within one container.
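To make the “multiple functions within one container” idea concrete, here is a minimal sketch of a card as one unit bundling a question, its input type, and its validation state. The markup, type names, and tag label are all assumptions for illustration; the post doesn’t describe the actual implementation:

```typescript
// Hypothetical sketch of the card UI pattern: one container per question,
// carrying the prompt, the question type, and the validation state together.

type QuestionType = "single-choice" | "multi-choice" | "free-text";

interface QuestionCard {
  id: string;
  tag: string; // short label telling users what kind of answer is expected
  prompt: string;
  type: QuestionType;
  valid: boolean;
}

// Render one card to an HTML string; the state class and data attribute let
// styling and behavior vary per question type without changing the container.
function renderCard(card: QuestionCard): string {
  const state = card.valid ? "is-valid" : "is-invalid";
  return (
    `<div class="card ${state}" data-type="${card.type}">` +
    `<span class="tag">${card.tag}</span>` +
    `<p>${card.prompt}</p>` +
    `</div>`
  );
}
```

Because every question lives in the same container shape, adding or reordering questions becomes a data change rather than a layout change, which is what makes the pattern easy to iterate on.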
Once the site was up and running, we moved to testing. The team identified internal participants and facilitated a series of usability testing sessions. We recruited participants from departments across the organization to best simulate the real-world perspectives of IT admins on both laptops and mobile devices.
Testing proved extremely valuable because it helped us uncover a variety of usability bugs. For example, our image upload function sometimes displayed the wrong image in the preview, confusing users.
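The post doesn’t say what caused the wrong-image preview, but one common culprit in upload UIs is a stale asynchronous load overwriting a newer selection: the user picks image A, then image B, and A’s slower read finishes last. A minimal, framework-free sketch of the usual token-guard fix (all names here are assumptions):

```typescript
// Hypothetical sketch: discard stale async preview results. Each request
// claims a monotonically increasing token; only the newest may render.

let latestToken = 0;

async function loadPreview(
  read: () => Promise<string>, // e.g. would wrap FileReader in the browser
  show: (dataUrl: string) => void // updates the preview element
): Promise<void> {
  const token = ++latestToken; // mark this request as the newest
  const dataUrl = await read();
  if (token === latestToken) {
    // a newer selection hasn't superseded us, so it's safe to render
    show(dataUrl);
  }
}
```

With the guard in place, a slow earlier read completes silently instead of clobbering the preview of the user’s latest selection.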
And, we identified a variety of UI issues. Users weren’t able to differentiate between the various question types because our color choices and icons read too similarly, and our validation states were not clearly conveyed. Findings like these led to refinements such as question tag naming conventions, so users clearly understood what each question type meant and what to expect. Our findings not only allowed us to quickly iterate on and refine our website, but also showed us just how receptive users were to the experience.
[Related: Breathing New Life into UX Research]
Thanks to an iterative process and tight collaboration, our team moved nimbly from conception and design to prototyping, testing and refinement. We created a thoughtfully designed, fully functional website in just weeks. Our solution successfully met our needs by providing an easy way to engage attendees at Connect and gather valuable feedback data at our booth, outside the event and even after it concluded. Perhaps best of all, the data we collected supplied us with real-world, actionable insights.