Redesigning All Trails — Trail Reviews
Design Case Study for uIDP
Here is the short version of the All Trails project if you’re in a rush…
1) The Problem
The All Trails user review feature needed a redesign to encourage more users to interact with the site and to help others find and explore trails.
2) Who I Worked With
I worked with a team of three awesome designers: Kelsey, Skylar, and Isaac. Over four weeks, we met three times a week to brainstorm, critique, and collaborate. Through these meetings we developed a thoroughly researched, user-tested solution.
3) What We Designed
We designed a multi-step card approach that better encouraged All Trails users to review and rate the trails they had hiked, so they could share their trail experiences with other users.
The demand for our product is there; we have over 1 million outdoor enthusiasts using our products each month, but our ratings system is flawed. We don’t know what’s wrong with it, but since our product is so heavily dependent on sharing unique experiences, we’re not able to grow our user base. — All Trails
One way All Trails helps users judge the trails in its system is by allowing registered users to rate trails on a 5-point scale. In addition, users are encouraged to write a comment, upload photos, and track their hike to share with others. The problem is that All Trails users currently leave few reviews and ratings, so they need encouragement to share their experiences for others to see.
Research and Exploration
We started out by doing online research into other websites and apps that have review systems. Through doing this research we discovered many review and rating design trends. Some of the most inspirational ones were Yelp, Facebook’s “report spam” feature, Amazon, and Rate My Professor.
What we learned:
- Review systems online vary drastically in length depending on the amount of context needed for the person reading the review.
- Reviews are used to help users make informed decisions.
- Reviews give users a voice to share their opinions.
We found current All Trails users and interviewed them about their experience on the site. We specifically asked them questions about what encouraged/discouraged them from leaving a trail review. We also aggregated information on the type of hikers using All Trails by focusing our efforts on trails in Bloomington, Indiana; Seattle, Washington; and San Francisco, California to get a sense of what types of trails users typically reviewed and rated in All Trails.
What we learned:
- There are two types of All Trails users: avid and casual outdoor explorers.
- Avid explorers value the difficulty, stats, and beauty of the trail. They are more likely to leave detailed All Trails reviews.
- Casual explorers value the amount of time, season, and overall rating of the trail. They are more likely to leave an overall rating without giving much detail or context.
Based on our initial research insights, we created two user personas to keep us focused on who we were designing the review system for: Sam Brown, an avid hiker, and Jordan Harvey, a casual hiker. Throughout the design process we weighed each decision against whether our personas would find the design usable. The personas stayed fluid: usability tests, further research, and conversations with real users continually refined the details we put in them.
Our Design Goal
“The current All Trails review system is intimidating to the user and a buzzkill to the outdoor experience. Outdoor physical activity gives you an emotional and physical high. We wanted to redesign the All Trails review system in order to capture this emotional experience and encourage the users to share these outdoor experiences (good or bad).”
Early on we held a few one-hour brainstorming sessions where we got the ideas out of our heads and sketched them on paper. We learned that simply putting your ideas on paper sparks discussion that helps you target areas of strength and weakness. Our main focus when brainstorming was how we could encourage both avid and casual hikers to tell their stories through the All Trails review system we designed.
Early Sketches and Brainstorming Ideas
Multi-Dimensional Cards vs. Static One-Page Review System
During our brainstorming sessions we found two common design solutions that we wanted to test on real users. The two designs were the multi-dimensional card and the static one-page design.
Multi-Dimensional Card Reviews — Our design hypothesis for this idea was that “users need an engaging experience that encourages them to review and rate the trails they have explored.”
Static One-Page Reviews — Our design hypothesis for this idea was that “users want to see a review system that allows them to quickly and efficiently rate and review the trails they have explored.”
We tested both of these approaches with our users to discover what they found the most encouraging and usable.
We ran user tests with twelve users, aged 20 to 24. To ensure consistency, we created a Google Form to capture our results and insights.
From our user testing we learned that users preferred the multi-dimensional card approach: it let them focus on one review question at a time and encouraged them to share more of their hiking experience.
A question we kept at the forefront of our minds when designing was:
“What kind of information can we gather from an avid hiker’s review that would be helpful to a casual hiker exploring the All Trails site?”
During our first round of usability tests we received valuable user feedback, which allowed us to make the following changes to our design prototype:
- Extra space and padding to declutter the content in each card
- Personalized text during the review process for the All Trails user
- Hover states on the rating bar to clarify what each number (1–5) meant
- Labels identifying optional cards that the user could skip
After presenting our second iteration to our uIDP class, we received more feedback and critique about the numbers identifying which card the user was currently on. Users were confused about whether the numbers at the top of each card were some strange new rating system. To resolve this, we removed the numbers and implemented a tabbed solution that shows the user which step of the review they are on. You can see this change in our final high fidelity design below.
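To make the behavior concrete, here is a minimal sketch of the multi-step card flow described above: tabbed step labels instead of numbers, optional cards that can be skipped, and hover labels giving context to each rating number. The step names, labels, and `ReviewFlow` class are all hypothetical illustrations, not the actual All Trails implementation.

```typescript
// Hypothetical review steps; the real card contents came from our prototype.
type Step = { id: string; optional: boolean };

const STEPS: Step[] = [
  { id: "rating", optional: false },
  { id: "comment", optional: true },
  { id: "photos", optional: true },
];

// Hover labels that add context to each number (1–5) on the rating bar.
const RATING_LABELS: Record<number, string> = {
  1: "Poor", 2: "Fair", 3: "Good", 4: "Great", 5: "Excellent",
};

class ReviewFlow {
  private index = 0;

  // Tab label shown at the top of the card, replacing the confusing numbers.
  currentTab(): string {
    return STEPS[this.index].id;
  }

  // Advance to the next card; returns false once the review is complete.
  next(): boolean {
    if (this.index < STEPS.length - 1) {
      this.index++;
      return true;
    }
    return false;
  }

  // Skipping is only allowed on cards marked optional.
  skip(): boolean {
    return STEPS[this.index].optional ? this.next() : false;
  }
}
```

Keeping the flow as explicit, named steps is what lets the UI render tabs instead of bare numbers, which was the fix our class feedback pointed us toward.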
High Fidelity Design
Final Design Interaction
What I Learned
- Listen to your users — Talk to real users and design a solution they need, not something you think they need.
- Sketch out your ideas — Don’t just talk about your design ideas; write them down and sketch out the design flow. Doing this encourages discussion and helps you nail down specific design ideas.
- Embrace constraints — The more you constrain yourself as a designer, the better. Constraints help you focus on a specific type of user and their needs, and in the end you design a better solution for those users. You can’t design for everyone.