Pride and prejudice: Mine is different

Minseo Jang
Published in Cochl
8 min read · Aug 29, 2024

When I was young, I downloaded a game called ‘Crazy Arcade’ to play with my younger brother at home. Before downloading, I had to check whether our computer met the necessary requirements. That childhood moment might be the first time I ever read technical documentation related to computers.

Like me, you might have found yourself jumping from one blog to another to download specific files, following instructions, or wandering through websites in search of the information you need. Sometimes, even after hours of googling, you still can’t find anything relevant. To prevent such frustrations, well-organized documentation is essential.

Not surprisingly, the importance of technical documentation is growing not only in the B2C sector but also in B2B, where it plays a crucial role in ensuring smooth communication with users (clients). Here’s a question to ponder:

“What are the key aspects to consider when writing technical documentation?”

At Cochl, we’re developing a Sound AI called Cochl.Sense, which can detect sounds from the world around us. Users can access it via the Cloud API or the Edge SDK. We also run a developer center that hosts our technical documentation, making it easier for users to work with our products.

We thought, “We’ve provided sufficient explanations of the Cloud API and Edge SDK, and we update our release notes whenever there’s a change. This is perfect; anyone can read it.” However, we recently had to confront our pride and prejudice regarding our developer center and technical documentation.

In this series, we’ll share how we identified this issue and how we’re working to solve it. In this first article, I’d like to introduce how we recognized our pride and prejudice through a usability test.

1. Recruiting the participants

In recent years, Cochl has garnered increasing interest from people curious about our product. To engage more users and ensure they continue using our services, it’s crucial to maintain the consistent performance of Cochl.Sense and to continually upgrade it. To this end, we decided to conduct ‘usability testing’ to determine whether someone who has never used Cochl.Sense before can easily navigate and find the information they need in our developer center.

Before we began, we talked with the development team to confirm what kinds of participants we needed and created a form to recruit them.

Let’s break down each element:

- Participant requirements

One of the most important aspects of usability testing is selecting the right participants. The requirements can vary greatly depending on the tech stack you use, the sector your business operates in, and the specific tasks you plan to assign during the usability test. Therefore, setting the appropriate criteria for participants is essential for successfully achieving the goals of usability testing.

For example, in our case, the technical documentation is divided into two parts: Cloud API and Edge SDK. This time, we wanted to focus on the Cloud API, so participants had to be proficient in JavaScript (Node.js), Java, or Python and comfortable reading technical documentation in English.

- Process

The usability testing process can be conducted either offline or online, each with its own pros and cons. We decided to conduct ours offline. While this approach requires all participants to gather at the same time and place, we found that the benefits outweighed the drawbacks. Conducting the test in person allowed us to observe directly where issues arose and to offer immediate assistance when needed. Additionally, having every conversation accessible to the whole team meant issues were understood in their raw form, without the distortion of personal interpretation.

Although we conducted usability testing just once, online testing could be a viable option for anyone planning a series of tests with the same participants or with a large number of participants.

- API integration experience

Our goal was to test the usability of the Cochl.Sense Cloud API, so prior experience with API integration was crucial. In the recruitment form, we asked participants whether they had relevant experience, what they consider first when integrating an API, and how frequently they refer to documentation during the process. The answers to these questions helped us establish guidelines for the tasks.

- Recruitment results

Over 20 people signed up for the usability test. Based on the priorities we had set, we decided to run the test with participants who primarily use Python. We announced the selection via email and sent follow-up text messages to make sure they had checked their email and responded. Ultimately, a total of six participants were confirmed.

2. Setting the tasks

Setting the tasks should be done in tandem with recruiting the participants. If the task difficulty isn’t well-balanced, the usability test may either conclude too quickly or become impossible to finish. Additionally, if the tasks aren’t aligned with the goals of the usability testing, even if the test completes without issues, you won’t gain any meaningful feedback.

- Usability testing tasks

The primary goal of our usability testing is to determine whether someone who has never used Cochl.Sense before can easily navigate and find the information they need. We planned for the tasks to take approximately 20–25 minutes to complete.

To address the first goal, “Is Cochl.Sense Cloud API easy to use?”, we designed three tasks:

  • Install Cochl.Sense Cloud API
  • Create a new project, detect sound from a live audio stream, and check the results
  • Create a new project, detect sound from an uploaded file, and check the results

Participants need to refer to our developer center to complete these tasks. They must click the Cochl.Sense Cloud API tab and follow the instructions in the “Getting Started” section. This process includes signing up for the Cochl.Sense dashboard and obtaining a project key.
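To give a sense of what this flow looks like from a participant’s seat, here is a minimal Python sketch of the file-upload task. Everything in it is a placeholder I’m using for illustration (the base URL, route, auth header, and response fields are invented); the real endpoints and schema are exactly what participants had to find in the “Getting Started” section.

```python
import requests

# NOTE: all endpoints and field names below are illustrative placeholders,
# not the real Cochl.Sense API. The actual routes and response schema are
# documented in the developer center.
API_BASE = "https://api.example.com/sense/v1"    # hypothetical base URL
PROJECT_KEY = "YOUR_PROJECT_KEY"                 # issued per project on the dashboard


def detect_from_file(path: str) -> dict:
    """Upload an audio file and return the detected sound events."""
    with open(path, "rb") as audio:
        response = requests.post(
            f"{API_BASE}/detect/file",           # hypothetical route
            headers={"x-api-key": PROJECT_KEY},  # hypothetical auth header
            files={"file": audio},
        )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    result = detect_from_file("doorbell.wav")
    for event in result.get("events", []):       # assumed response shape
        print(event["tag"], event["probability"])
```

If a sketch like this feels trivial to assemble, that was precisely what the tasks were probing: whether our documentation gets a first-time user to this point without friction.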

Through these tasks, we aim to evaluate the following:

1. Is the “Getting Started” guide well-written?
2. Is the overall process, from signing up and downloading to setting up the environment and checking the results, smoothly connected?
3. Is there any outdated information or content?

For the second goal, “Can users find the information they need in our developer center?”, we created two additional tasks:

  • Activate post action and receive an email notification
  • Control the sensitivity settings

Participants need to refer to the “Advanced Configuration” section under the “Getting Started” and “Post Action” tabs to complete these tasks (a rough sketch of what these settings involve follows the list below). Through this, we can assess:

1. Is the desired information well-presented?
2. If users struggle to find the information, what causes the difficulty?
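For context on what those two tasks cover, here is a hedged sketch of the kind of configuration involved. The route, field names, and value scale below are placeholders invented for illustration; the real options are described on the “Advanced Configuration” and “Post Action” pages of our developer center.

```python
import requests

# Purely illustrative: the route, field names, and sensitivity scale are
# placeholders, not the real Cochl.Sense configuration schema.
API_BASE = "https://api.example.com/sense/v1"   # hypothetical base URL
PROJECT_KEY = "YOUR_PROJECT_KEY"

config = {
    # assumed per-tag sensitivity scale: lower values mean fewer detections
    "sensitivity": {"Siren": -1, "Glass_break": 0},
    # hypothetical post action that sends an email when a tag is detected
    "post_action": {"type": "email", "to": "alerts@example.com"},
}

response = requests.put(
    f"{API_BASE}/projects/my-project/config",   # hypothetical route
    headers={"x-api-key": PROJECT_KEY},
    json=config,
)
response.raise_for_status()
print("Project configuration updated")
```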

To finalize the tasks, I drafted an initial plan and discussed it with the development team to identify any necessary additions or revisions.

3. Writing the scenario

It’s crucial to plan how a usability test will be conducted in advance. Without a clear plan, the test may not proceed smoothly, and you could easily lose track of what needs to be done at each step. We divided our usability testing scenario into three parts and organized the scripts, questions, and required time for each.

- Introduction

The introduction is designed to ease any initial awkwardness among participants and to clearly communicate the goal of the usability testing. We start with a brief introduction of the company, followed by participants introducing themselves and sharing their experiences with API integration. We’ve allocated 20 minutes for this part.

- Task conducting

This section is where participants actually complete the tasks. While you could skip the “offering a scenario” step, we included it to help participants become more engaged in the usability testing process. Throughout the tasks, we continuously ask participants which task they are working on and whether they have encountered any problems. The overall time for this part is 20–25 minutes.

- Sharing experiences

This might be the most important part of the usability testing, as it’s where we hear honest, insightful feedback from the participants. When drafting questions for this section, it’s essential to avoid ones that invite simple yes/no answers; instead, ask questions that encourage participants to explain why they feel a certain way. We divided the questions into two categories: UX aspects (e.g., the installation process, navigating information) and technical aspects (e.g., how well each feature operates, any error messages encountered). The insights gathered from these questions provided valuable clues on how to address our weaknesses. We’ve allocated 20 minutes for this section.

4. The day of usability testing

After all the preparation, I felt a surprising sense of confidence, even worrying that the test might finish too quickly. But reality had other plans.

Just a few minutes into the usability testing, our team was already murmuring, “Why isn’t it working?”

- Wi-Fi connection issue

With less than 10 minutes to go before the session started, our Wi-Fi suddenly failed us. It was the last thing we expected, as it had never happened before. Thankfully, the Wi-Fi recovered on its own, but I couldn’t help keeping a watchful eye on it throughout the entire session.

- Personal computer issue

Honestly, this was something I hadn’t anticipated. Depending on the type of computer participants brought, some encountered issues right from the start. Two participants were using M2 MacBooks, and one of them ran into repeated installation problems. We had to lend him a spare MacBook from our office, which meant he needed more time than the other participants to complete the tasks.

- Feature malfunction

When it came to setting up the post action feature, everything seemed to be going smoothly — until it was time to send the email notification. The process took longer than usual, and as participants kept trying to resend, the queue grew longer and longer. Fortunately, the issue was something we could address on the spot, but it made me wonder — what if there hadn’t been an engineer available to help me resolve this problem?

We documented all the feedback from the participants in a separate sheet and categorized the issues. This allowed us to clearly see the pride and prejudice we had unknowingly harbored.

A few comments & feedback

“We prioritized advancing the technology, thinking we could skip refining the technical documentation for now. If we had time, we’d get to it later.” Well, it seems that decision has come back to haunt us. Karma, indeed! So, how do we handle this problem?

In our next article, I’ll address the steps we took to solve these issues, one by one. See you in the next article!
