We Don’t Need No Stinkin’ Code: Testing Software Requirements

Karl Wiegers
Dec 7, 2020

Someone once asked me when you can begin testing software. “As soon as you’ve written your first requirement, you can begin testing,” I replied. It’s hard to visualize how a system will function just by reading its requirements. Tests based on those requirements make the expected system behaviors more tangible. Even the simple act of designing tests reveals many requirements problems long before you can execute the tests on a running system.

Requirements and Tests

Tests and requirements present complementary views of the system. Creating multiple views of a system — written requirements, diagrams, tests, prototypes, and so forth — provides a much richer understanding of the system than can any single representation. Some agile development methods emphasize writing user acceptance tests from user stories in lieu of writing detailed functional requirements. Thinking about the system from a testing perspective is valuable, but that approach still leaves you with just a single representation of requirements knowledge, so you must trust it to be correct.

In essence, a requirement says, “Here’s what the system should do under these conditions.” A test asks, “How can I tell if the system is doing what I think it should?”

Writing black-box (functional) tests crystallizes your vision of how the system should behave under certain conditions. Vague and ambiguous requirements will jump out at you because you won’t be able to describe the expected system response. When business analysts (BAs), developers, and customers walk through tests together, they’ll codify a shared vision of how the product will work and increase their confidence that the requirements are correct.
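To make that concrete, here’s a tiny sketch in Python. It’s my own hypothetical example, not from any real project: one version of a requirement names a measurable expected response, the other doesn’t, and only the first yields an executable test.

import time

# Hypothetical requirements, paired with the black-box test that would verify them.
REQ_VAGUE = "The system shall display the order promptly."              # untestable
REQ_TESTABLE = "The system shall display the order within 2 seconds."   # testable

def display_order(order_id: int) -> float:
    """Stand-in for the system under test; returns elapsed seconds."""
    start = time.perf_counter()
    # ... the real system would fetch and render the order here ...
    return time.perf_counter() - start

def test_order_display_time():
    # REQ_TESTABLE states a concrete expected result, so we can assert it.
    assert display_order(order_id=1001) <= 2.0

# There's no equivalent test for REQ_VAGUE: "promptly" gives the tester
# no expected system response to check against.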

A personal experience brought home to me the importance of combining test thinking with requirements specification. I once asked my group’s UNIX scripting guru, Charlie, to build a simple e-mail interface extension for a commercial defect-tracking system we had adopted. I wrote a dozen functional requirements that described how the e-mail interface should work. Charlie was thrilled. He’d written many scripts for people, but he’d never seen written requirements before.

Unfortunately, I waited a couple of weeks before I wrote the tests for this e-mail function. Sure enough, I had made an error in one of the requirements. I found the mistake because my mental image of how I expected the function to work, represented in about twenty tests, was inconsistent with one of the requirements. Chagrined, I corrected the defective requirement before Charlie had completed his implementation, and when he delivered the script, it was defect free. It was a small victory, but small victories add up.

Conceptual Tests

You can begin deriving conceptual tests from user requirements early in the development process. Use the tests to evaluate functional requirements, analysis models, and prototypes. The tests should cover the normal flow of each use case, alternative flows, and the exceptions you identified during elicitation and analysis. Similarly, agile acceptance tests should cover both expected behaviors and exceptions.

Consider an application called the Chemical Tracking System. One use case, “View a Stored Order,” lets the user retrieve a particular order for a chemical from the database and view its details. Some conceptual tests appear below, followed by a sketch of them as automated checks:

  • User enters order number to view, order exists, user placed the order. Expected result: Display the order details.
  • User enters order number to view, order doesn’t exist. Expected result: Display message “Sorry, I can’t find that order.”
  • User enters order number to view, order exists, user didn’t place the order. Expected result: Display message “Sorry, that’s not your order. You can’t view it.”
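Here’s how those three conceptual tests might look as parameterized checks in Python. The order store and the view_order function are stand-ins I invented for illustration; the real Chemical Tracking System would sit behind some API.

import pytest

# Hypothetical stand-in for the system's order store and retrieval behavior.
ORDERS = {1001: {"owner": "alice", "details": "500 mL acetone"}}

def view_order(order_number: int, user: str) -> str:
    if order_number not in ORDERS:
        return "Sorry, I can't find that order."
    if ORDERS[order_number]["owner"] != user:
        return "Sorry, that's not your order. You can't view it."
    return ORDERS[order_number]["details"]

@pytest.mark.parametrize("order_number, user, expected", [
    (1001, "alice", "500 mL acetone"),                       # order exists, user placed it
    (9999, "alice", "Sorry, I can't find that order."),      # order doesn't exist
    (1001, "bob", "Sorry, that's not your order. You can't view it."),  # not this user's order
])
def test_view_stored_order(order_number, user, expected):
    assert view_order(order_number, user) == expected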

Ideally, a BA will write the functional requirements and a tester will write the tests from a common starting point, as shown in Figure 1. Ambiguities in the user requirements and differences of interpretation will lead to inconsistencies between the views represented by the functional requirements, models, and tests. Those inconsistencies reveal errors. As developers translate requirements into user interface and technical designs, testers can elaborate the conceptual tests into detailed test procedures.

Figure 1. Development and testing work products derive from a common source.

A Testing Example

Let’s see how the Chemical Tracking System team tied together requirements specification, analysis modeling, and early test-case generation. Here are some pieces of requirements information, all of which relate to the task of requesting a chemical.

Use Case. One use case is “Request a Chemical.” This use case includes a path that permits the user to request a chemical container that’s already available in the chemical stockroom. Here’s the use case description:

The Requester specifies the desired chemical to request by entering its name or chemical ID number. The system either offers the Requester a container of the chemical from the chemical stockroom or lets the Requester order one from a vendor.

Functional Requirement. Here’s a bit of functionality associated with this use case (a brief behavioral sketch follows the list):

  1. If the stockroom has containers of the chemical being requested, the system shall display a list of the available containers.
  2. The user shall either select one of the displayed containers or ask to place an order for a new container from a vendor.
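Taken together, the two requirements describe a small, checkable behavior. Here’s a minimal sketch of it, assuming a simple in-memory stockroom; every name here is illustrative rather than taken from the actual system.

# Hypothetical stockroom contents keyed by chemical name.
STOCKROOM = {"acetone": ["container-1", "container-2"], "toluene": []}

def available_containers(chemical: str) -> list[str]:
    """Requirement 1: list the stockroom's containers of the requested chemical."""
    return STOCKROOM.get(chemical, [])

def fulfill_request(chemical: str, choice: str | None) -> str:
    """Requirement 2: the user selects a displayed container or orders a new one."""
    containers = available_containers(chemical)
    if choice in containers:
        return f"reserved {choice}"
    return "placed vendor order"  # no selection made, or nothing in stock

assert available_containers("acetone") == ["container-1", "container-2"]
assert fulfill_request("acetone", "container-2") == "reserved container-2"
assert fulfill_request("toluene", None) == "placed vendor order"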

Dialog Map. Figure 2 illustrates a portion of the dialog map for the “Request a Chemical” use case that pertains to this function. A dialog map is a high-level view of a user interface’s architecture, modeled as a state-transition diagram (see my article “Improve the User Experience with Dialog Maps”). The boxes in this dialog map represent user interface displays (dialog boxes in this case), and the arrows represent possible navigation paths from one display to another.

Figure 2. Portion of the dialog map for the “Request a Chemical” use case.
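Because a dialog map is just a state-transition diagram, you can capture it in a few lines of code and “execute” it later. The sketch below reconstructs the Figure 2 fragment from the surrounding text; the exact states and event labels are my assumptions, since the figure shows more than the prose describes.

# States reconstructed from the text: DB40 (enter chemical ID), DB50 (available
# containers), DB60 (vendor order), DB70 (current chemical request list).
# The DB60 -> DB70 arrow is an assumption; the article only names the others.
DIALOG_MAP = {
    ("DB40", "valid ID, containers in stock"): "DB50",
    ("DB50", "select container"): "DB70",
    ("DB50", "order new container"): "DB60",
    ("DB60", "complete vendor order"): "DB70",
}

def next_display(state: str, event: str) -> str | None:
    """Follow one navigation arrow; None means the map has no such arrow."""
    return DIALOG_MAP.get((state, event))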

Test. Because this use case has several possible execution paths, you can envision numerous tests to address the normal flow, alternative flows, and exceptions. The following is just one test, based on the flow that shows the user the available containers in the chemical stockroom:

At dialog box DB40, enter a valid chemical ID; the chemical stockroom has two containers of this chemical. Dialog box DB50 appears, showing the two containers. Select the second container. DB50 closes and container 2 is added to the bottom of the Current Chemical Request List in dialog box DB70.

Ramesh, the test lead for the Chemical Tracking System, wrote several tests like this one, based on his understanding of how the user might interact with the system to request a chemical. Such abstract tests are independent of implementation details. They don’t describe entering data into specific fields, clicking buttons, or other interaction techniques. As development progresses, the tester can refine these abstract tests into specific test procedures.
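One way to keep such a test abstract is to record it as pure data: a path of (dialog, event) steps plus an expected outcome, with no fields or buttons mentioned. This encoding is my own sketch, not Ramesh’s actual notation.

ABSTRACT_TEST = {
    "name": "request available container",
    "path": [
        ("DB40", "valid ID, containers in stock"),  # two containers appear in DB50
        ("DB50", "select container"),               # pick the second container
    ],
    "expected_final_display": "DB70",  # container added to the request list
}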

Now comes the fun part — testing the requirements. Ramesh first mapped the tests against the functional requirements. He checked to make certain that every test could be “executed” by going through a set of existing requirements. He also made sure that at least one test covered every functional requirement.
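Both of those checks are mechanical enough to automate once you link tests to requirement IDs. Here’s a sketch with made-up IDs; a real project would pull these links from a requirements management tool.

# Hypothetical traceability data: which requirements each test exercises.
TEST_COVERS = {
    "T-01": {"FR-1", "FR-2"},  # e.g., the container-selection test above
    "T-02": {"FR-1"},
}
ALL_REQUIREMENTS = {"FR-1", "FR-2", "FR-3"}

# Check 1: every test must be "executable" via requirements that exist.
for test, reqs in TEST_COVERS.items():
    orphaned = reqs - ALL_REQUIREMENTS
    if orphaned:
        print(f"{test} references nonexistent requirements: {orphaned}")

# Check 2: every requirement must be covered by at least one test.
covered = set().union(*TEST_COVERS.values())
for req in sorted(ALL_REQUIREMENTS - covered):
    print(f"{req} has no test")  # here, FR-3 has no test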

Next, Ramesh traced the execution path for every test on the dialog map with a highlighter pen. The yellow line in Figure 3 shows how the preceding test traces onto the dialog map.

Figure 3. Tracing a test onto the dialog map for the “Request a Chemical” use case.

By tracing the execution path for each test on the model, you can find incorrect or missing requirements, improve the user’s navigation options, and refine the tests. Suppose that after “executing” all the tests in this fashion, the navigation line in Figure 2 labeled “order new container” that goes from DB50 to DB60 hasn’t been highlighted. There are two possible interpretations:

  • That navigation isn’t a permitted system behavior. The BA needs to remove that arrow from the dialog map. If you have a requirement that specifies the transition, that requirement also must go.
  • The navigation is a legitimate system behavior, but the test that demonstrates the behavior is missing.

When I find such a disconnect, I don’t know which possible interpretation is correct. However, I do know that all of the views of the requirements — textual, models, and tests — must agree, so there’s clearly something wrong.

Suppose that another test states that the user can take some action to move directly from DB40 to DB70. However, the dialog map doesn’t contain such a navigation line, so the test can’t be “executed”: you can’t get there from here. Again, there are two possible interpretations:

  • The navigation from DB40 to DB70 is not a permitted system behavior, so the test is wrong.
  • The navigation from DB40 to DB70 is a legitimate function, but the requirement that allows you to execute the test is missing.

In these examples, the BA and the tester combined requirements, analysis models, and tests to detect missing, erroneous, or unnecessary requirements long before any code was written. Every time I use this technique, I find errors in all the items I’m comparing to each other, quickly and cheaply.
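You can automate the highlighter exercise the same way. The sketch below replays each test path against the dialog-map table from earlier: a step the map lacks flags an unexecutable test (the second disconnect), and any arrow left untraced afterward flags the first. The test paths here are illustrative.

DIALOG_MAP = {  # repeated from the earlier sketch so this runs standalone
    ("DB40", "valid ID, containers in stock"): "DB50",
    ("DB50", "select container"): "DB70",
    ("DB50", "order new container"): "DB60",
    ("DB60", "complete vendor order"): "DB70",
}

TEST_PATHS = {
    "request available container": [("DB40", "valid ID, containers in stock"),
                                    ("DB50", "select container")],
    "jump straight to request list": [("DB40", "view request list")],  # no such arrow
}

traced = set()
for name, path in TEST_PATHS.items():
    for step in path:
        if step not in DIALOG_MAP:
            print(f"Test '{name}' can't be executed: no arrow for {step}")
            break
        traced.add(step)

# Arrows no test highlighted, like "order new container" from DB50 to DB60.
for arrow in sorted(set(DIALOG_MAP) - traced):
    print(f"No test traces arrow {arrow}")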

As consultant Ross Collard pointed out, “Use cases and tests work well together in two ways: If the use cases for a system are complete, accurate, and clear, the process of deriving the tests is straightforward. And if the use cases are not in good shape, the attempt to derive tests will help to debug the use cases.” I couldn’t agree more. Conceptual testing of software requirements is a powerful technique for discovering requirement ambiguities and errors early on.

=================

This article is adapted from Software Requirements, 3rd Edition by Karl Wiegers and Joy Beatty. If you’re interested in software requirements, business analysis, project management, software quality, or consulting, Process Impact provides numerous useful publications, downloads, and other resources. Karl’s latest book is The Thoughtless Design of Everyday Things.
