UX Case Study: SeedLogix Website Builder & Content Management System Prototype

Josh Jennings
4 min read · Jan 10, 2019

--

TL;DR

SeedLogix, a startup in Austin, TX, was looking to improve its website builder and content management system (CMS). I conducted and recorded 4 user testing sessions in the CMS and analyzed each session for user errors. The results informed a prototype design that was tested with a 5th user, who successfully completed all instructed tasks.

Project Details:

  • Project: UX Testing and Prototyping
  • Company: SeedLogix/IgnitedLocal
  • Date: December 2018
  • Location: Austin, TX

The Challenge

SeedLogix was looking to start scaling its marketing and web design program by giving freelance contractors and other marketing agencies access to its online CMS platform. The platform provided a website builder and online marketing tools so that a user could execute all of their online campaigns through one system.

Since the system was so new, it quickly became clear that users were having trouble with certain aspects of the website builder. To identify user errors and outline potential solutions, I initiated a UX design project involving several rounds of user testing, user error analysis, and a prototyped solution built in Adobe XD.

User Testing Sessions

To begin the project, 4 user testing sessions were scheduled with a group of web designers, graphic designers, UX designers, and website developers, since these were the personas most likely to use the website builder. Once the redesign was complete, a 5th user testing session would be performed with the new prototype to evaluate the improvement.

Five users were chosen in total because, according to the Nielsen Norman Group, “the best results come from testing no more than 5 users and running as many small tests as you can afford.” This held true in practice: the 3rd and 4th users made errors similar to those of their predecessors, and the majority of errors were identified within the first 2 sessions.

During each session, the users were instructed to complete 5 tasks vital to the creation of a website:

  1. Create a new website
  2. Successfully navigate a website touchpoint
  3. Add HTML code to the home page
  4. Upload an image
  5. Create/view new internal pages

To keep the results free of bias, each user was prompted to complete each task without instruction, except where guidance was needed to keep the test moving forward.

Each user session was recorded for review after the interviews. I took notes on each user after their interview and organized the data into a spreadsheet. The errors were categorized by:

  • The location where it occurred
  • The task being attempted
  • The type of error
  • The number of users who made the error
  • How easy it would be to solve
  • The potential value of solving it

By averaging each error’s ease-to-solve and potential-value scores, I also created a priority stack of the user errors so the developers could tackle the most crucial elements first.
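
As a rough illustration, that prioritization step can be sketched in a few lines of Python. The field names and the 1-to-5 scoring scale below are assumptions made for the example, not the actual spreadsheet’s columns:

    # Hypothetical error records: each error is scored 1-5 for how easy
    # it would be to solve and for the potential value of solving it.
    errors = [
        {"task": "Add HTML code", "ease_to_solve": 4, "potential_value": 5},
        {"task": "Upload an image", "ease_to_solve": 5, "potential_value": 3},
        {"task": "Create a new website", "ease_to_solve": 2, "potential_value": 4},
    ]

    # Priority is the average of the two scores; sorting in descending
    # order puts the most crucial fixes at the top of the stack.
    for error in errors:
        error["priority"] = (error["ease_to_solve"] + error["potential_value"]) / 2

    priority_stack = sorted(errors, key=lambda e: e["priority"], reverse=True)
    for error in priority_stack:
        print(f"{error['priority']:.1f}  {error['task']}")

Sorting on the averaged score means a fix that is both easy and valuable rises to the top, while difficult, low-value fixes sink to the bottom.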

The Prototype

Once the user testing and analysis were complete, I designed new layouts for each screen of the CMS based on the recommended solutions from the user testing results.

The design changes that were made included:

  • Adding a welcome module to explain the meaning of “Campaign”
  • Adding quick access buttons for creating different touchpoints
  • Displaying tab names in touchpoint navigation (previously hidden behind a hover effect)
  • Changing names/icons for several tabs in navigation
  • Changing ellipses to a plus icon for adding blocks
  • Adding an “Upload Image” button to each page layout

Finally, the screens were linked together into a clickable prototype to be tested with the final user and iterated upon in future tests.

Interested in seeing the prototype design?

View Clickable Prototype or Download Prototype PDF

The Results

In the final user testing session with the new prototype, the user was able to complete all tasks with far more ease and accuracy than users on the previous version.

While the initial user testing sessions took between 20 and 35 minutes for attempting the tasks and answering questions, the recording of the final testing session was only 6:29 long, less than a third the length of the shortest initial session.

These results indicated that the new design streamlined the user flow, reduced confusion, and minimized the potential for user errors. In addition, several new observations were recorded to be tested in future designs.

--

Josh Jennings

I'm a web designer and digital marketer in Austin, TX. I'm a hardcore believer in trying new things… even when it doesn't make any sense.