Your Website’s Navigation is Probably Wrong

Matt Eriksson
6 min read · Apr 8, 2020


‘How hard can it be? It’s just a website and there are millions of examples of good navigation out there. Why not just use one of those proven examples, right?’

Sure, it may seem like a straightforward job, so why spend time and money on research and testing? Copying a competitor’s IA might seem like an easy way forward, but how do you know they have done their research and implemented it correctly? More importantly, your users are most likely very different from your competitors’ and have different needs. Do you know whether you have one user group or several? How do you serve them most effectively?

That’s where research, testing and analysis will help you find the best way forward. Here I’ll outline how I generally approach the task of overhauling the IA and navigation of a website to help improve the user experience.

AUDIT

To begin with, it is important to understand what works and what doesn’t with the current website. To do that you need to become an expert on the product you’re working on. My suggestion is to approach this as a traditional heuristic evaluation. Analyse the UI in detail and highlight any UX issues which may be present. You’re likely to uncover outdated information, broken or inaccurate links, clunky navigation elements, duplicated content and a whole host of other problems. Often many of these issues get introduced over time as content is added to a site and changes are made. I generally keep track of each issue, the URL and its severity in a spreadsheet.
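For illustration, here is a minimal Python sketch of the kind of issue log I mean; the column names and example rows are hypothetical, and a plain spreadsheet works just as well.

    import csv

    # Hypothetical columns for the audit log; adjust to whatever your team tracks.
    FIELDS = ["page_url", "issue", "severity"]

    issues = [
        {"page_url": "/events/2019-archive", "issue": "Outdated event information", "severity": "medium"},
        {"page_url": "/about/team", "issue": "Broken link to bio page", "severity": "high"},
    ]

    # Write the log to a CSV file which can be opened and extended in any spreadsheet tool.
    with open("audit_log.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(issues)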

In addition to building familiarity, the findings from an audit serve two further purposes:

  • Highlighting issues which impact the user experience and may be the cause of problematic navigation.
  • Developing a series of assumptions which can be validated or clarified as you proceed.
Website audit, highlighted issues and navigation discrepancies.

If one doesn’t already exist, I prefer to create a sitemap of the current live site and any proposed new pages using Gloomaps. This can easily be updated and shared with your stakeholders as you move forward.

GOOGLE ANALYTICS

With a good understanding of where the key issues within your IA lie, your next step should be to delve into Google Analytics, if it is set up. This tool will add further clarity to how people are using (or not using) the website. By exploring the Behaviour reports in particular, you will be able to gain a range of valuable insights.

Google Analytics will provide valuable insights.

Are there particular junctures in the user journey where you see huge drop-offs? Are there particular pages or sections which draw attention while others go unnoticed? Consider these as context for what you want to explore in the next phase: card sorting. For example, on a recent job for a large organisation I uncovered unexpected user groups based on acquisition, and a series of pages which were rarely accessed despite being crucial to the service on offer. Ultimately, I needed clarity on whether the navigation was the culprit.
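As a rough illustration, assuming you export page-level data from the Behaviour reports to CSV, a short pandas sketch can surface pages with unusually high exit rates; the file and column names below are hypothetical and will depend on your actual export.

    import pandas as pd

    # Hypothetical export from Behaviour > Site Content, with columns:
    # page, pageviews, exits. Adjust to match your own export.
    df = pd.read_csv("behaviour_export.csv")

    df["exit_rate"] = df["exits"] / df["pageviews"]

    # Pages with plenty of traffic but a high exit rate are candidate drop-off
    # points worth probing in the card sort and tree test.
    drop_offs = (
        df[df["pageviews"] > 100]
        .sort_values("exit_rate", ascending=False)
        .head(10)
    )
    print(drop_offs[["page", "pageviews", "exit_rate"]])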

CARD SORT

First of all, consider which labels you really need to test. Based on your research and the sitemap, start by gathering your labels and make sure they are clear enough for your users to understand. If there is confusion about what a label means, it may affect the whole study and indicate that the same ambiguity also has an effect on the live website.

Next, I create an open card sort in Optimal Workshop, which allows participants to name their own categories. I suggest keeping the card sort to a maximum of 35–40 cards, as too many cards can take a long time for participants to sort through and ultimately lead to them abandoning the study.

A great feature of Optimal Workshop is that you can print the cards and conduct a moderated, in-person test. On the occasions I have had the luxury of access to a team of designers and developers, I have done this. The benefit of an in-person test is that it can provide valuable insights before proceeding with a wider, unmoderated study. If there are any issues with the labels, you have a chance to make changes at this stage, and it is easy to enter the results from the printed cards back into Optimal Workshop.

Moderated card sorting study.

When you feel confident about the cards, provide participants with clear instructions for how to conduct the sort. There is plenty of information on Optimal Workshop’s website on how to do this, but don’t forget to include your own project-specific instructions. As an example, on one recent job I had to make it clear that the organisation was moving from a one-event-a-year approach to a year-round service, as this information would influence how participants sorted cards and decided on categories.

ANALYSIS

With a small group of participants your results may be easier to interpret, but also less reliable. In a recent test I had a total of 48 participants who completed the study, which resulted in 52 categories. That’s a lot of categories, but upon review you’re likely to find that many of them are similar or overlap.

Depending on what you’re testing and how many participants you have, Optimal Workshop’s different analysis tools may prove more or less suitable. In my case, with a substantial number of respondents, I have found the Dendrograms most useful. It may take a while to go through the data, and you may need to correlate results across different analysis methods to uncover patterns and whittle the data down into real insights. I often use the Similarity Matrix and Participant-Centric Analysis to validate and clarify the findings in the Dendrograms.

Optimal Workshop offers several ways of analysing your data.
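To make the idea behind a dendrogram concrete, here is a minimal sketch of the underlying technique: hierarchical clustering on a card co-occurrence matrix (how often each pair of cards was placed in the same category). This is not Optimal Workshop’s own implementation, and the card names and counts are made up.

    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.cluster.hierarchy import linkage, dendrogram

    # Hypothetical co-occurrence counts out of 48 participants:
    # cooccurrence[i][j] = participants who put card i and card j in the same category.
    cards = ["Annual report", "Board members", "Upcoming events", "Past events", "Contact us"]
    cooccurrence = np.array([
        [48, 40,  5,  4,  2],
        [40, 48,  3,  6,  8],
        [ 5,  3, 48, 39,  4],
        [ 4,  6, 39, 48,  5],
        [ 2,  8,  4,  5, 48],
    ])

    # Convert agreement into distance: cards sorted together often are "close".
    distance = 1 - cooccurrence / 48.0

    # SciPy's linkage expects a condensed distance vector (the upper triangle).
    condensed = distance[np.triu_indices(len(cards), k=1)]

    # Average-linkage clustering, then plot the resulting dendrogram.
    Z = linkage(condensed, method="average")
    dendrogram(Z, labels=cards)
    plt.tight_layout()
    plt.show()

In this toy example the two event cards and the two organisational cards form their own branches, which is exactly the kind of grouping a dendrogram makes visible.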

Even with a good understanding of how users would organise your content, it is highly likely that you’ll still have a few outstanding issues. These can be labels which participants struggled to sort, or categories which are not clear. It is important to keep these issues in mind as you proceed, as there is an opportunity to test them during tree testing.

TREE TEST

With a new IA unveiled by the card sorting exercise, I update the sitemap to reflect the findings. Next, it is time to put the new, proposed navigation to the test. Tree testing allows you to confirm whether the navigation works well by giving participants a series of tasks to perform. Again, Optimal Workshop has a great guide on how to effectively formulate your questions and analyse your data.

Treejack study alongside questionnaire and updated sitemap.

Hopefully, your participants will have managed to navigate the IA effectively, but if not, this is the chance to make any final adjustments. Find out whether users can complete the tasks efficiently and in a timely manner, or whether there is any ambiguity in their journey. Make any amendments before handing off to your stakeholders to implement the new navigation.
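If you want to sanity-check the numbers outside the tool, a small sketch like the one below computes success rate, directness and median time per task; it assumes a hypothetical CSV export with one row per participant per task, so the file and column names are placeholders.

    import pandas as pd

    # Hypothetical export: one row per participant per task, with columns
    # task, success (True/False), direct (True if no backtracking), time_seconds.
    results = pd.read_csv("tree_test_results.csv")

    # Tasks with low success or directness point at parts of the IA that still need work.
    summary = results.groupby("task").agg(
        participants=("success", "size"),
        success_rate=("success", "mean"),
        directness=("direct", "mean"),
        median_time=("time_seconds", "median"),
    )
    print(summary.sort_values("success_rate"))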

CONCLUSION

Spending the extra time to conduct this kind of research and testing will pay off in the long run, and I can almost guarantee that you will uncover issues and findings that will surprise you and help illuminate your way forward. After all, what’s the point of having a great service or brilliant content if your users struggle to access it?

This process should be repeated on a regular basis, and before any redesign, to ensure new issues are not introduced. Any time you add a page or a label, or change a navigational item, you should look at how it impacts the full user experience.

Finally, it is worth noting that every project is different and may require a slightly different approach. When conducting user-centred research and testing, it is also essential to consider business requirements and brand needs.

Thanks for reading.
