Sector: Travel
Challenge: Optimise and consolidate 3 airport lounge membership apps to have consistent functionality but brand-specific front-ends
My role: Workshop facilitation, user research, interface specification, interface design, usability testing
Project time: 6 months
The Challenge
Priority Pass, LoungeKey and Mastercard Airport Experiences were all independent iOS and web apps for users to search, access and manage airport lounge memberships. When my team and I were tasked with consolidating the apps, we saw an opportunity to optimise the user experience with an efficient, collaborative design thinking process.
This is how we executed that process…
Heuristic Evaluations
My team and I hosted a heuristic evaluation workshop once per week for 3 months to establish hypotheses for good and bad user experience design across all 3 apps, competitors and similar digital experiences.
For each workshop, we invited participants from relevant parts of the business, such as product owners, customer support operators, marketing executives, developers and business analysts to capture their experience and requirements from a business perspective.
We printed out screens for a feature from each of our 3 apps (such as onboarding journeys, searching for airport lounges, accessing digital membership cards etc.), as well as similar features from competitors or other products, and stuck each screen up around the room. Alongside these screenshots, we put up descriptions of Jakob Nielsen’s 10 heuristics for user interface design for reference.
Participants spent 20–30 minutes familiarising themselves with the feature and identifying points where they felt the heuristics were either present or missing. They’d then stick a post-it note to the area with their initials, the heuristic number and a smiley or sad face to indicate whether it was a good or bad experience point.
Next, we went through the screenshots, with each participant describing their thoughts on the features. There were occasional contradictions, and people sometimes changed their minds, but it gave all of us an opportunity to empathise with other parts of the business and the development process and to understand their perspective on the products.
Validating Hypotheses With Data
Now that we had an idea of the business’ perspective on our users’ product experience, could we validate it with data?
One example of a hypothesis was that users might be unable to access photos of a lounge very easily. We looked at the way Airbnb and Tripadvisor presented photos and noticed that they were much more prominent and included more user-generated content, whereas we just had a carousel that had to be swiped to access any other images. The marketing team saw this as a big opportunity, as they’d had feedback that members felt the photos provided didn’t accurately reflect the lounges.
Looking at the analytics, we saw that of the 41% of users who swiped through photos, 35% swiped only once and immediately swiped back. We suspected this was because of the unusual placement and behaviour of the carousel, where swiping actually took users on to a new page, so we set out to improve this experience. The swipe-back rate became a metric we could use to measure success.
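As a rough sketch of how a swipe-back (“bounce”) metric like this could be derived, the snippet below counts users whose first forward swipe on the carousel was immediately followed by a back swipe. The event shape and field names here are illustrative assumptions, not our actual analytics schema.

```ts
// Illustrative only: the SwipeEvent shape is an assumption, not the real schema.
interface SwipeEvent {
  userId: string;
  timestamp: number; // epoch millis
  direction: 'forward' | 'back';
}

function swipeBackRate(events: SwipeEvent[]): number {
  // Group carousel swipe events per user.
  const byUser = new Map<string, SwipeEvent[]>();
  for (const e of events) {
    const list = byUser.get(e.userId) ?? [];
    list.push(e);
    byUser.set(e.userId, list);
  }

  let swipers = 0;
  let bouncedBack = 0;
  for (const list of byUser.values()) {
    list.sort((a, b) => a.timestamp - b.timestamp);
    if (list.length === 0) continue;
    swipers += 1;
    // "Bounce" = a single forward swipe immediately followed by a back swipe.
    if (
      list.length >= 2 &&
      list[0].direction === 'forward' &&
      list[1].direction === 'back'
    ) {
      bouncedBack += 1;
    }
  }

  return swipers === 0 ? 0 : bouncedBack / swipers;
}
```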
Collaborative Ideation
In the next workshop, we recapped the heuristic evaluation phase and presented back the data to validate some of the hypotheses.
Armed with this, the same participants as the previous workshop spent 20–30 minutes individually sketching solutions, aiming for quantity over quality. Depending on their preference, we suggested crazy 8s, storyboards, feature lists or whatever format they felt comfortable presenting ideas in.
Once everyone had finished their concepts, each participant stuck them on the wall and had 5 minutes to present back to the group without interruption. Some ideas were similar, some were very blue sky and some were more conservative. In some instances, especially for a large feature (such as searching for lounges in the apps), we’d run a second round of concepting so participants could build on ideas from the first round.
To narrow down the ideas, each participant had 5 votes as sticky dots to stick on their preferred designs and features. We then compiled a collection of ideas, ready to be prioritised in the next session.
Prioritising Issues
Now that we had a handful of potential feature experience improvements, we used the knowledge and experience in the room to plot their perceived impact against effort.
As everyone knew where the ideas had come from, what they’d require to build and where they fit in the bigger picture, we saved significant time that would otherwise have been spent explaining them over and over, and we could use them as a jumping-off point, adjusting ideas to make them easier to execute.
Defining the Scope of Work
The final set of enhancements for each feature was voted on and prioritised before being documented as areas to validate with users.
Each discovery process was completed over a couple of weeks and gave us enough considered ideas to move into the definition phase — testing with users.
Wireframe Concepts
We created low-fidelity wireframes to present back to the workshop groups and ensure we’d captured functionality correctly.
In some cases, we used these low-fidelity wireframes for guerrilla testing in airport lounges, talking users through them to get quick, raw feedback. For more nuanced improvements, we created higher-fidelity prototypes.
Validating Solutions
We then started to design higher fidelity prototypes to run usability tests on our newly conceived feature improvements.
We ran think-aloud tests through Userlytics with 5–7 participants per feature to gather qualitative feedback and validate our ideas.
The results of these tests were genuinely positive and not only validated our enhancement ideas but also showed that the process had been a success and that ideas from within the business were well informed. In some cases, features (such as maps and an airport takeout feature) underwent further ideation before being retested to get qualified results.
In a couple of cases, though, our hypotheses were slightly off the mark. We’d thought that one of the most important parts of the apps for users was the ability to see what the lounges actually looked like through photos; however, this turned out to be less important than access/entitlement information, wifi connectivity and whether the lounge served food and drink.
We ended up with over 20 tests and over 100 participants. To get through these quickly, involve those who had been part of the ideation in the first place and make sure we captured as many insights as possible, we ran group review sessions. In these sessions, 4 or 5 of us would watch back the videos, take notes, timestamp anything we felt was important and discuss it as a group. With members of my team and me each facilitating sessions, we were able to run many in parallel and get through them efficiently.
All of these learnings enabled us to refine concepts and take well thought out, validated designs to the development team, who already had a head start as they’d been a significant part of the workshops and process.
Developing the Designs
The easy(ish) bit.
We had a fantastic agency developing for the project, one that had worked on the Priority Pass product for many years. LoungeKey and Mastercard Airport Experiences were new to them, but through our previous workshops we’d collectively understood the differences between the apps and where we could bring them together.
We ran separate discovery workshops with the developers, talking through our feature enhancement ideas and how we should execute them to build out a solid design system framework. We agreed to use Brad Frost’s Atomic Design framework to keep track of all components great and small, and named all components in line with the Apple Human Interface Guidelines to avoid any confusion.
Researching existing systems such as Material, Polaris, Mailchimp and others, we were able to develop ‘Compass’, our own white-label design system, quickly and effectively. The development discovery process took about 2 months, with a further 6 months to develop the foundations.
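To illustrate the white-label idea behind Compass, here is a simplified sketch of shared components consuming brand-specific token sets, so that functionality stays consistent while the front-end looks like Priority Pass or LoungeKey. The token names, values and function are illustrative assumptions, not the actual Compass API.

```ts
// Illustrative only: token names and values are assumptions, not the real Compass tokens.
interface BrandTheme {
  name: string;
  color: { primary: string; surface: string; onPrimary: string };
  typography: { fontFamily: string; baseSize: number }; // baseSize in px
  spacing: (multiplier: number) => number; // 4px baseline grid
}

const baseSpacing = (multiplier: number) => multiplier * 4;

const priorityPass: BrandTheme = {
  name: 'Priority Pass',
  color: { primary: '#1A1A2E', surface: '#FFFFFF', onPrimary: '#FFFFFF' },
  typography: { fontFamily: 'System', baseSize: 16 },
  spacing: baseSpacing,
};

const loungeKey: BrandTheme = {
  name: 'LoungeKey',
  color: { primary: '#00395D', surface: '#FFFFFF', onPrimary: '#FFFFFF' },
  typography: { fontFamily: 'System', baseSize: 16 },
  spacing: baseSpacing,
};

// An "atom" in Atomic Design terms: one button style, themed per brand.
function buttonStyle(theme: BrandTheme) {
  return {
    backgroundColor: theme.color.primary,
    color: theme.color.onPrimary,
    fontFamily: theme.typography.fontFamily,
    fontSize: theme.typography.baseSize,
    padding: `${theme.spacing(2)}px ${theme.spacing(4)}px`,
  };
}

console.log(buttonStyle(priorityPass));
console.log(buttonStyle(loungeKey));
```

The same pattern extends to the third brand: Mastercard Airport Experiences would simply be another theme object consumed by the same components.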
Everything in this project was set up to continue as an ongoing (if less intense) process, with development overlapping design and the two staggered accordingly.
Impact & Takeaways
This process was new to the business but was incredibly successful. We were able to champion design thinking, inspiring other teams to host similar workshops and come to us with problems beyond user experience that could be solved with similar processes.
Some key successes of this project include:
⭐️ Streamlined a collaborative, validated design and development process
Starting internally with business hypotheses and including many participants in workshops was daunting at first, but it quickly showed value when fewer people had to be briefed on complex feature enhancements and more people around the business were aware of the ideas coming out of the workshops.
⭐️ Consolidated and developed 3 apps into a single, white-label design system in under 12 months
Working from existing products with stakeholders keen to improve usability gave us the ability to maximise the value of this project. We were lucky to have the time, trust and resources to establish and execute a new process for the business and deliver exceptional results.
⭐️️️ Enabled rapid development through a considered design system
With the work we did minimising components and building a solid structural foundation of typography, spacing and documentation, we were eventually able to develop and A/B test component variations in every sprint to design the most informed user experience I’ve ever worked on.
⭐️ Started a qualitative data lake
As we were capturing and testing so many ideas, we used Aurelius to capture and share all of this data. This sped up the process, as we had more and more documented insight into user behaviour to guide our design decisions.
The main success of this project was in operational effectiveness. For all of the feature improvements we implemented, we measured and shared success metrics; however, due to the coronavirus pandemic in 2020, airport lounge usage dropped by over 80% compared with 2019, and some of these features haven’t yet had enough time in the wild to be accurately measured.
Stay tuned for more when we truly know the impact of our work 🤘🏻