Way.com Parking Check-in/Check-out Process

High-Fidelity Prototyping, Usability Testing, Comparison Testing

Alice Leung
A.Leung Designs
5 min read · Jan 27, 2019


Overview

Way.com is an eCommerce business that connects local vendors and consumers across five categories: Parking, Dining, Movies, Events, and Activities. It is a responsive website with iOS and Android apps. One of the main issues affecting revenue was the confusing process of selecting check-in and check-out dates and times in the Parking category. As Way.com is a 50-person startup with no dedicated UX expertise, I led a design team of two through this usability/comparison testing effort.

One of the design ideas for Way.com’s Parking Check-in / Check-out Process

Challenge

With a bounce rate of ~40% in the Parking category, my team and I were responsible for understanding why the bounce rate was so high, and for finding a design solution to reduce it by 15 percentage points.

My Role

I was the product manager and, in part, product designer at the company. I led this project to figure out why and how we had a 40% bounce rate, and to leverage interaction design as well as usability/comparison testing to reduce the bounce rate to a level the business could accept: 25%.

Solution

This initiative was carried forward by my design team after I left the company, and it successfully reduced the bounce rate from 40% to 25%. A better parking check-in/check-out experience was implemented after a series of iterative designs and usability tests.

Background/Problem

The bounce rate for the parking vertical flow was around 40%, and we identified three main reasons through Google Analytics as well as customer feedback:

One of Way.com’s hotel parking booking pages, with parking type, check-in, and check-out elements.
  1. Prices are inconsistent between Google Ads, listings from search results, and parking detail pages — marketing-related.
  2. Parking lots are not available for check-in and check-out dates/times that users selected — business/operations-related, design is secondary.
  3. Users get confused when selecting check-in and check-out dates/times during the checkout flow — design-related.

Assumption: New/first-time users are affected the most and are the main contributors to the 40% bounce rate.

My team and I focused on Reason #3 as it was design-related, while the Marketing and Business/Operations teams handled Reasons #1 and #2.

Wireframe/Prototype Designs

Because this project took place in a fast-paced startup environment, the tradeoff was to skip low-fidelity wireframes and test the designs as high-fidelity visual layouts.

I created three high-fidelity designs with different layouts: Design A, Design B, and Design C:

Design A: A step-by-step process in the booking panel under the images.
Design B: A fixed right-side booking panel, laid out top to bottom, with check-in/check-out date dropdowns.
Design C: A fixed right-side booking panel with a calendar display for dates and a scrollable time selector.

Interaction design in this case was, in fact, affected by visual design, because participants would be drawn to certain elements based on what attracted them first (i.e., pictures, star ratings, number of reviews, etc.). My team and I kept this in mind before diving into our usability comparison testing sessions.

Usability Comparison Testing

We conducted moderated testing using InVision prototypes with six participants. The qualitative feedback we gathered covered: 1) navigational path, 2) error details, 3) what they found confusing, and 4) what they found useful. Below are the metrics we decided to measure (a sketch of how they could be aggregated follows the list):

  • Task completion
  • # of errors
  • Time on task
  • Ease of use (scale from 1-very hard to 5-very easy)
  • Satisfaction rating (scale 1-not satisfied to 5-very satisfied)
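
To make the comparison concrete, here is a minimal sketch in Python of how these per-participant measurements could be rolled up into per-design averages. The participant rows below are entirely hypothetical placeholders, not our actual study data:

```python
from statistics import mean

# Hypothetical per-participant results for one task. Each tuple is
# (design, completed, errors, time_on_task_sec, ease_of_use, satisfaction),
# using the 1-5 rating scales described above.
results = [
    ("A", True,  1, 34, 3, 3),
    ("A", True,  0, 28, 4, 4),
    ("B", True,  0, 25, 4, 4),
    ("B", False, 2, 41, 2, 3),
    ("C", True,  0, 19, 5, 5),
    ("C", True,  0, 20, 4, 5),
]

# Roll the raw rows up into per-design averages for each metric.
for design in ("A", "B", "C"):
    rows = [r for r in results if r[0] == design]
    completion = mean(1 if r[1] else 0 for r in rows)
    print(
        f"Design {design}: "
        f"completion {completion:.0%}, "
        f"avg errors {mean(r[2] for r in rows):.1f}, "
        f"avg time {mean(r[3] for r in rows):.0f}s, "
        f"ease {mean(r[4] for r in rows):.1f}/5, "
        f"satisfaction {mean(r[5] for r in rows):.1f}/5"
    )
```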

Results

For Task #1’s use case, I created a straightforward task for participants to 1) choose a parking type, 2) select the desired check-in date and time, and 3) select the desired check-out date and time.

Comparison results for Task #1: A straightforward task of booking a parking spot

Design C was the least error-prone and had the highest average satisfaction rating. Participants also finished the check-in and check-out process quickly, in an average of 19 seconds.

For Task #2’s use case, I added an element of complexity: the participant selected a check-in date, but later changed their mind and selected a different check-in date. The check-out date and time remained the same.

Comparison results for Task #2: Changing your mind on the parking check-in date/time

It turned out that Design C still performed the best overall, even though a higher percentage of participants made an error with it. Its overall satisfaction was still much higher than that of Design A and Design B, and its average time on task was comparable to Design B’s (the two fastest).

Task #3’s use case was similar to Task #2’s, but instead of changing their mind about the date(s), participants changed their mind about the parking type. This was intended to test how the order in which content is shown (i.e., step-by-step versus all on one screen) affects the interaction.

Comparison results for Task #3: Changing your mind on the parking type

In this particular use case, Design B performed the best. No participants made an error, and its average ease-of-use and satisfaction ratings were comparable to Design C’s. The biggest win was that participants finished the booking the fastest, at an average of 25 seconds.

Key Takeaway

  • Even though Design C did not perform the best in Task #3, that task was considered an edge case compared to Tasks #1 and #2 (which happen more frequently). The interesting part of this usability testing was that Design C brought an element of delight and originality to all of our participants.
  • Users want a history of what they chose previously, since they might not always remember what they selected.
  • The calendar needs to be easier to read; the text is too small and crowded.
  • Users want the flexibility to edit/change what they chose easily and in place.

What we learned: The order of testing may have affected user behavior/feedback, because users grow more familiar each time they test the same task, even across different designs.

Recommendation: Randomize the order in which Designs A, B, and C are tested.
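
As a rough illustration of that recommendation, the sketch below assigns each participant a counterbalanced design order using a 3x3 Latin square, so each design appears first, second, and third equally often. The participant IDs are hypothetical, and this is one possible protocol rather than the one we ran:

```python
# 3x3 Latin square over the designs: each design appears once in each
# position, so presentation order is balanced across participants.
latin_square = [
    ["A", "B", "C"],
    ["B", "C", "A"],
    ["C", "A", "B"],
]

participants = ["P1", "P2", "P3", "P4", "P5", "P6"]  # hypothetical IDs
for i, participant in enumerate(participants):
    order = latin_square[i % len(latin_square)]
    print(f"{participant}: test designs in order {' -> '.join(order)}")
```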

Next Steps

Iterate on Design C based on the usability testing feedback from Designs A, B, and C, and conduct usability testing on the resulting Design D.
