The Testing Role in Agile — An Overview

Kate Falanga
7 min read · Nov 16, 2017


I have decided to write some posts outlining concepts I have received questions about over the past few years. I'm not much of a blogger, but I have found that documenting these ideas makes it easier to have productive conversations.

I would like to acknowledge that wording and terms can get contentious in the testing world. There are different schools of thought on testing that disagree on certain terms and define things differently. I encourage readers not to focus too much on words they may find disagreeable and to look instead at the bigger picture.

A good place to start is a high-level view of an idealized Agile software development process that includes a dedicated testing role, typically referred to as QA. I call this idealized, but it is a process I have put in place and have seen work successfully on numerous occasions. I'll stay high level and dig deeper into certain areas based on any feedback.

The Kick Off

This is a meeting with everyone involved in the project. This can include sales, marketing, project management, product owners, developers, architects, visual designers, user experience designers, and QA. The goal is to give everyone an idea of not only what the team plans to do but why. A review of the business goals that have driven the decisions made so far is essential. It's important for someone in a testing role to understand business goals in order to get a better idea of what "quality" means for this project. They can then work with their team to help create an environment in which quality can occur.

If the project is already underway, it is beneficial to hold periodic meetings with the same purpose: aligning on goals and updating everyone as the business shifts in response to market changes.

The “Definition of Done”

It's important for teams to define what it means for a User Story to be done before they actually start working. However, a Definition of Done (DoD) can be updated later if the team decides it's necessary. A DoD can include things like:

  1. All Merge Requests were approved by two developers prior to deploying to the QA Environment
  2. Unit Tests have passed
  3. Acceptance Criteria have been met
  4. Functional Tests have passed
  5. Associated bugs have been fixed (unless otherwise determined by the Product Owner)
  6. Product Owner accepts the User Story

There are many different versions of a DoD but what I find to be extremely important is that a User Story can’t be considered done unless it has been tested AND any bugs that have been uncovered have been addressed. Having the team agree to a DoD that includes testing and bug remediation means that the team has accepted the shared responsibility of creating a quality product.

Sprint Planning

This should be a group activity for the team and can be very difficult for the testing role. This is where you see a lot of the switch from a TOLD role to a TELL role, and that transition can be hard for everyone.

Since the Definition of Done includes testing, it's important for the testing activity to be estimated as part of the overall User Story estimation. That means QA needs to be part of the conversation. They need to ask questions about acceptance criteria and give feedback if something isn't testable or doesn't seem right based on business and product knowledge. Developers tend to discuss the difficulty of implementing a piece of functionality, and that is important to know. If a developer is worried, then someone in QA should make a mental note to talk to that developer after the meeting to understand why. It might mean more testing is needed or that they should partner with the developer earlier in the process to help.

Some things are easy to code and hard to test, and vice versa. It is important for the team to know that level of effort without derailing the meeting. That can be a tough balance. It can also be hard to speak up if a tester feels a User Story doesn't include enough testing time. However, not speaking up means more crunch time down the road. Teams that don't include testing effort in their estimates are commonly the teams that acquire technical debt by passing stories with bugs due to time constraints, or by not completing User Stories within Sprints.

I also suggest that teams take on Stories of different sizes during a Sprint in order to stagger when User Stories are passed on to QA or Product Owners. This helps alleviate the end-of-Sprint bottleneck, when all stories need to be tested and accepted by the Product Owner the day before the Sprint ends. The more a team works together, the better they become at estimating as a group, so good team dynamics and communication are important for all roles.

During the Sprint

Tracking

Part of working as a team is everyone using the same tool, with all the information in one place. Typically this is a tool such as Jira, but there are many similar ones. It's helpful for teams to use a board, either online in the tool or a physical one, that tracks the status of each User Story. It's common to have at least three columns, such as:

To Do | In Progress | Done.

However, while it's frowned upon in some circles, I have found it helpful to include another three columns in order to make statuses clearer. This helps pinpoint, visually on the board, where bottlenecks might be during a Sprint (see the sketch after the list below). This would make it:

To Do | In Development | In Testing | Tested w Open Issues | Done | Accepted

  • To Do = Stories that are in the sprint but haven’t been started yet
  • In Development = Stories that are being actively worked on by Development
  • In Testing = Stories that are under test
  • Tested w Open Issues = Stories that have been tested but have open bugs
  • Done = Stories that have been tested and all prioritized bugs addressed
  • Accepted = Stories that have been accepted by the Product Owner and are ready for Production
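
To make the bottleneck idea concrete, here is a minimal sketch (in Python, with made-up story keys and statuses, not tied to Jira or any other tool's API) of tallying stories per column; a column whose count keeps growing mid-Sprint is the likely bottleneck.

```python
from collections import Counter
from enum import Enum

# The six board columns described above.
class Status(Enum):
    TO_DO = "To Do"
    IN_DEVELOPMENT = "In Development"
    IN_TESTING = "In Testing"
    TESTED_W_OPEN_ISSUES = "Tested w Open Issues"
    DONE = "Done"
    ACCEPTED = "Accepted"

# Hypothetical snapshot of a Sprint board; real data would come from your tracking tool.
stories = [
    ("STORY-101", Status.IN_DEVELOPMENT),
    ("STORY-102", Status.IN_TESTING),
    ("STORY-103", Status.IN_TESTING),
    ("STORY-104", Status.TESTED_W_OPEN_ISSUES),
    ("STORY-105", Status.DONE),
]

# Count stories per column to see where work is piling up.
counts = Counter(status for _, status in stories)
for status in Status:
    print(f"{status.value}: {counts.get(status, 0)}")
```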

The first couple of days in a Sprint

It might be easy to think that the beginning of a sprint is a quieter time for testing roles since there are no complete User Stories to test. However, there are quite a few activities that this role can take on.

Story Kick Offs (Three Amigos)

This process is on my To Do list to write about in more detail. The quick version is that before a developer begins work on a User Story, they quickly huddle with the Product Owner and QA. This should be a five-minute conversation reviewing the acceptance criteria and discussing in more detail how that User Story will be tested. The User Story may be updated based on this conversation. The developer also comes away with a clearer idea of what tests will likely be performed, so they are more likely to make sure those tests pass before passing the story to QA for testing.

Design Reviews

Visual and User Experience Designers may have meetings where they discuss future designs and solicit feedback. It can be helpful for QA to attend those meetings in order to get a feel for how the product is evolving. They may even be able to provide some feedback based on product experience in order to prevent future feature issues.

Targeted Manual Regression

This is another To Do item to document in more detail. A list is created and updated with high-level functionality areas or pages, depending on what makes sense for your product. The list is prioritized and timeboxed. The beginning of a sprint is a good time to do some exploratory testing in less critical areas that QA may not have had time for at the end of the last sprint. (Update: Here is more detail on this process.)

Test Prep

Some stories may require setup or even some documentation. A tester should look through all the User Stories in the Sprint and make sure they will have what they need once development is complete.

Automation Updates

This may mean updating existing automation scripts or creating new ones. It may also include reviewing automated runs and acting on failures.
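
As one hedged illustration of the kind of script this might mean, here is a small functional check written with pytest and the requests library; the base URL, endpoint, and expected response fields are placeholders invented for the example, not part of any real product.

```python
import requests

BASE_URL = "https://example.test/api"  # placeholder environment URL for the sketch

def test_search_returns_results_for_known_term():
    # Functional check tied to a User Story's acceptance criteria:
    # searching for an existing product should return at least one result.
    response = requests.get(f"{BASE_URL}/search", params={"q": "widget"}, timeout=10)
    assert response.status_code == 200
    body = response.json()
    assert body["results"], "expected at least one search result"
```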

Testing Stories

Once a story has been assigned to QA, they can begin by ensuring the Acceptance Criteria have been met. How that is accomplished should be up to the individual tester. At this point they have a good understanding of the business needs and have reviewed the story with the Product Owner as well as the Developer. They should have the information they need in order to test effectively. However, the testing process should never focus solely on the Acceptance Criteria. Exploration is essential, and the team should understand that issues not explicitly stated in the User Stories may be reported. For some problems it may be helpful to discuss them with a Developer or the Product Owner prior to actually writing up a bug. It may not be a problem at all and therefore doesn't need to be documented. Also, the problem might be so small that it can be fixed and tested on the spot.

The goal for QA should not be finding and writing up as many bugs as they can. The goal should be to provide fast, business-focused, and actionable feedback on the product under test. Bugs are just one way of providing that feedback. Verbal conversations are another. Some level of documentation of the decisions made during those conversations is helpful so the entire team is clear on what was decided. Comments on User Stories can be just as helpful as bugs in some cases.

If bugs are found and documented, they should be associated with the User Story in some way. It's helpful to think of a User Story as a parent and bugs, comments, and sub-tasks as its children. This allows the team to look at a User Story and see everything associated with it.
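
To illustrate the parent/child idea (a sketch of the relationship only, not any tracking tool's actual data model), the snippet below models a User Story holding its child bugs, with a check that mirrors the Definition of Done above: the story only counts as Done once it has been tested and every prioritized bug on it has been addressed.

```python
from dataclasses import dataclass, field

@dataclass
class Bug:
    key: str
    prioritized: bool = True  # the Product Owner may deprioritize a bug
    fixed: bool = False

@dataclass
class UserStory:
    key: str
    tested: bool = False
    bugs: list[Bug] = field(default_factory=list)  # children of the story

    def is_done(self) -> bool:
        # Mirrors the DoD above: tested AND all prioritized bugs addressed.
        return self.tested and all(b.fixed for b in self.bugs if b.prioritized)

# Example: a tested story with one open prioritized bug is not yet Done.
story = UserStory("STORY-104", tested=True, bugs=[Bug("BUG-7", fixed=False)])
print(story.is_done())  # False until BUG-7 is fixed
```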

If no problems are found, the story can be assigned to the Product Owner for acceptance. Any documented bugs should be assigned to the developer who worked on the User Story. Once fixed, the bug is assigned back to the person who wrote it; QA then verifies the fix and closes the bug if it is resolved. No bug should be reopened more than once. A developer and tester should work together to solve tricky bugs rather than ping-pong tickets.

Rinse and Repeat

There is more to a functional team than I have documented here but I believe I hit a few highlights. I plan to dig a little deeper into some of these concepts. Questions, comments and concerns are welcome.
