Taming the content monster

Samantha Campbell
Designing Atlassian
5 min read · Dec 12, 2022
Illustration by P. Dickison

How to audit your content with UX heuristics

I froze for a minute. Audit the what? A content hub with 282 pages, manually? Err…sure, let’s do it!

“A content audit gives a holistic view of what’s working and what isn’t.” (Claire Brain, Head of Copywriting, Boom Online Marketing)

If you’re not familiar with content strategy, it’s the planning, measuring, and governance of content to make sure it fulfils user goals and meets business objectives. Content can get out of control quickly (aka the content monster), and content audits are a large part of every content strategist’s life. I know, I know, it’s not the sexiest-sounding work, but I personally love it. There’s something curiously exciting about how the ‘content elements’ intersect with the ‘people elements’.

I’m a content strategist at Atlassian, and this audit was done to assess the quality and value of an internal content hub used by around 70% of Atlassians. The content hub is hosted in Confluence and made up of roughly 80% written content, 15% visuals, and 5% videos.

Here’s how it went…

Planned out the steps

There are always many ‘moving parts’ in a content audit. Even more so when running a manual audit without the luxury of typical website auditing software to generate lists and reports or crawl links. I find that a structured plan helps me stay on top of all these details.

We have an awesome internal ‘ways of working’ framework that we use at Atlassian, which creates space for discovery before jumping into delivery mode. I used this framework to help me break down the work into steps.

Content audit timeline and steps

Step 1: Kicked off and set goals

We all know preparation is everything. Right? Right??

With any content audit, the first and arguably most important step is to define clear goals for why you’re doing it. In this case, the team wanted to refresh an existing content hub (Confluence space) to address user feedback and align with future business objectives. The team wanted to know ‘more’ about their content. So, we sat down together to figure out what ‘more’ meant, defined goals, and discussed what to do with the learnings.

Example of a goal matched to an objective:

Identify most read pages ▶️ Know if content is relevant

Step 2: Got to know our users

We used this time to get to know our users, define problems, and discuss the value that any potential changes could bring. We validated our decisions with recent user research and then collaboratively agreed on the below hypothesis before proceeding with the next step.

Our hypothesis:

‘If we better understand who our customers are and how they use our content, then upgrade the quality of our content based on these findings, we can improve relevance, engagement, satisfaction, and ultimately influence the adoption of our product.’

Step 3: Explored tools and scoring criteria

I looked into what tools we could use and what metrics we had available for measuring success.

For the dataset, I exported the raw space data using Confluence analytics (Spaces), and later built out the spreadsheet further with fields that we needed (see Step 4).
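To make this concrete, here’s a minimal sketch of loading an export like that and adding the extra fields. It’s written in Python with pandas, and the file name and column names are illustrative assumptions, not the actual Confluence analytics schema.

```python
import pandas as pd

# Load the raw space export from Confluence analytics.
# The file name and columns are illustrative assumptions.
df = pd.read_csv("space_analytics_export.csv")

# Add the extra audit fields to fill in during review (see Step 4).
for field in ["page_purpose", "value_score", "discoverability_score", "readability_score"]:
    df[field] = pd.NA

df.to_csv("content_inventory.csv", index=False)
```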

For the performance metrics, I needed scoring criteria. I searched online and found many resources, but they were either too complex, too simple, or just not fit for our purposes. So, I decided to create my own content scoring matrix inspired by Nielsen’s usability heuristics. Nielsen’s heuristics are well researched and widely accepted in the industry, and I find these principles just as applicable to content as they are to design.

My content scoring matrix contained four main sections (a simplified slice is sketched in code below):

  1. Usability heuristics
  2. Content guidelines matched to UX heuristics
  3. Scoring categories/criteria collated from the guidelines (Value, Discoverability, Readability)
  4. Metrics aligned to the categories/criteria for future tracking
Content scoring matrix
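To give a feel for the mapping, here’s a simplified slice of the matrix expressed as data. The two heuristics are Nielsen’s own; the guidelines, categories, and metrics attached to them are illustrative assumptions, not the full matrix.

```python
# A simplified, illustrative slice of the content scoring matrix.
# Each entry links a usability heuristic to a content guideline,
# a scoring category, and a metric for future tracking.
scoring_matrix = [
    {
        "heuristic": "Match between system and the real world",
        "guideline": "Use plain, jargon-free language",
        "category": "Readability",
        "metric": "Quality score",
    },
    {
        "heuristic": "Recognition rather than recall",
        "guideline": "Use relevant titles, keywords, and labels",
        "category": "Discoverability",
        "metric": "Page views",
    },
]
```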

Step 4: Ran the inventory and evaluated the content

I added extra columns to the ‘raw’ spreadsheet to suit our specific needs, including things like ‘page purpose’. I then reviewed each page against the scoring categories and criteria defined in the previous step (Value, Discoverability, Readability). Scoring against these categories and criteria helped me stay objective, and I used a Likert-style rating scale to produce a Total Quality Score (sketched below).

Categories and criteria:

  • Value = purpose is clear, content is relevant, visuals are suitable, CTA is present, goals are met
  • Discoverability = navigation is clear, title is relevant, keywords and labels are used
  • Readability = page hierarchy is clear, copy is skimmable, length is appropriate, language is jargon free
Content inventory spreadsheet
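Here’s a rough sketch of that roll-up, assuming a 1–5 Likert rating per category and a simple unweighted average for the Total Quality Score. The scale, the weighting, and the example pages are my assumptions, not the exact system used.

```python
import pandas as pd

# Illustrative ratings: each category scored 1 (poor) to 5 (excellent).
pages = pd.DataFrame(
    {
        "page": ["Getting started", "Release notes", "FAQ"],
        "value_score": [4, 2, 5],
        "discoverability_score": [3, 2, 4],
        "readability_score": [5, 3, 4],
    }
)

score_cols = ["value_score", "discoverability_score", "readability_score"]

# Total Quality Score: a simple unweighted average of the three categories.
pages["total_quality_score"] = pages[score_cols].mean(axis=1).round(2)
print(pages)
```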

Step 5: Pulled the insights and aligned with goals

This next step was all about making sense of the data. It’s always my least- and most-favourite part of the audit. It’s hard, and sometimes tedious, but so satisfying as you get to know ‘more’ about your content. Identifying page purpose, finding out what’s being read (or not), and uncovering areas to improve are all invaluable for providing a better product or service.

Example of a goal matched to an objective matched to a metric:

Identify most read pages ▶️ Know if content is relevant ▶️ Page views

Tamed the content monster (for now)

Once I finished running the audit, I presented the findings back to the team to highlight key metrics, performance, and insights.

Examples of insights (the first two are sketched in code below):

  1. Only 8% of the content received over 80% of the total views.
  2. The content quality scores of many of our most visited pages were medium to low.
  3. Many of our pages had a mixed purpose, which confused readers.
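The first two insights fall straight out of the inventory spreadsheet. A hedged sketch, assuming ‘page’, ‘page_views’, and ‘total_quality_score’ columns like those in the earlier examples:

```python
import pandas as pd

df = pd.read_csv("content_inventory.csv")

# Insight 1: what share of pages accounts for ~80% of total views?
df = df.sort_values("page_views", ascending=False)
cumulative_share = df["page_views"].cumsum() / df["page_views"].sum()
pages_for_80pct = int((cumulative_share < 0.8).sum()) + 1
print(f"{pages_for_80pct / len(df):.0%} of pages receive ~80% of all views")

# Insight 2: flag the most visited pages that scored medium to low.
top_decile = df.head(max(1, len(df) // 10))
needs_work = top_decile[top_decile["total_quality_score"] < 3]
print(needs_work[["page", "page_views", "total_quality_score"]])
```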

I’m happy to say that we’ve all thoroughly embraced these findings and are currently rebuilding our information architecture to make sure it meets our users’ goals. This effort is being supported by exploring user experience journeys and stronger personas. We also identified gaps in our content governance processes and feedback mechanisms that we want to address. Phew, so much more work to do!

But honestly, regularly auditing your content and aligning what you create with your goals will help you, like us, keep that ‘content monster’ tamed!

“Quality, relevant content can’t be spotted by an algorithm. You can’t subscribe to it. You need people — actual human beings — to create or curate it.” (Kristina Halvorson, CEO, Brain Traffic)


A content strategist aka content monster tamer. I’m a writer, dreamer, traveler, lover of clouds, and chaser of quiet.