Quality Commitments

Christoph
5 min read · Mar 3, 2020


Google currently lists 25,800,000 references to the phrase “committed to quality.” While this is easy to say, it’s a lot harder to define what that means from the perspective of anyone delivering customer value. With that in mind, I propose the following Big Idea: A team owns their definition of quality. They define what quality means to them with their stakeholders, and they take each defect that their work encounters as a learning opportunity in order to make their product better.

The goal of Quality Commitments is to codify an Agile approach to the non-functional requirements related to software quality. Wikipedia currently links to 84 different kinds of “quality attributes,” or “ilities.” In a traditional waterfall approach, stakeholders such as architects and product owners would get together during a project’s initiation and define their quality attribute requirements. One recommended way to do this is to hold a “quality attribute workshop” where they can pick which attributes matter to them and set expectations for the team tasked with building the project.

Team Ownership

The most important thing is that the team has a collective sense of ownership of what defines quality for their project, that they can articulate what those commitments are, and that they share those commitments with stakeholders.

Sprint Zero

I still believe that a Quality Attributes Workshop is a valuable exercise. The big difference is that instead of things being handed down from on high, in what would traditionally be called a waterfall approach, the team embraces quality as an essential Agile attribute of their work, and works with stakeholders to define together what their commitments to quality are in concrete terms.

The Quality Commitments Matrix

One technique that I’ve used to visualize this approach is to create what I call a Quality Commitments Matrix. It’s the product of a team’s Quality Attributes Workshop and acts as simple visual documentation of how they define quality: it lays out the various methods the team uses to ensure quality, points to any areas that are covered by other teams within the organization, and calls attention to methods they won’t be using that could pose a risk.

For example, if they aren’t going to be doing any performance testing, they should document that upfront and get sign-off from stakeholders, rather than committing the unforced error of letting the problem bite them in the ass months later. (BTW, given the number of times I’ve seen teams hit with performance issues, at this point in my career I’ve decided that software projects should almost always do performance testing.)

Here are some possible dimensions covered in a Quality Commitments Matrix:

Phase: Development, UAT…
Type: xUnit, linting, code coverage, performance, observability, BDD, manual…
Questions: Is it part of the definition of done? Does it run on CI? Does it cover regressions? Does it act as documentation?
Boundaries: Positive, Negative, Property-based?
Internal awareness: White Box, Black Box?
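To make those dimensions concrete, here is a minimal sketch of how one row of such a matrix could be modeled in code. All type and field names are hypothetical, chosen only to mirror the dimensions above; they don’t come from any real tool.

```typescript
// Hypothetical model of one row in a Quality Commitments Matrix.
// Field names mirror the dimensions listed above; they are illustrative only.
type Phase = "development" | "uat" | "production";
type Boundary = "positive" | "negative" | "property-based";
type Awareness = "white-box" | "black-box";

interface QualityCommitment {
  phase: Phase;                    // when in the lifecycle the method applies
  type: string;                    // e.g. "xUnit", "linting", "performance"
  partOfDefinitionOfDone: boolean;
  runsOnCI: boolean;
  coversRegressions: boolean;
  actsAsDocumentation: boolean;
  boundaries: Boundary[];          // which kinds of cases it exercises
  awareness: Awareness;            // does it see the internals or not?
}

// One example row: unit tests run during development.
const unitTests: QualityCommitment = {
  phase: "development",
  type: "xUnit",
  partOfDefinitionOfDone: true,
  runsOnCI: true,
  coversRegressions: true,
  actsAsDocumentation: true,
  boundaries: ["positive", "negative"],
  awareness: "white-box",
};
```

A team could keep a list of these rows in the repository itself, so the matrix lives next to the code it describes.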

An advantage of this approach is that it helps everyone speak the same language when it comes to quality and testing. One of the most striking aspects of software development is how vague testing terminology can be, creating many opportunities for confusion and doubt that translate into friction and defects. By defining quality upfront and making it a living document, everyone is instantly working on the same page, and the team can introduce new members without their needing to spend weeks or months figuring out what is expected of them.

Once completed, rather than having a person or group define what quality is, it should be apparent to anyone working on the project what their commitments to quality are, and how they work to keep them.

Here’s an example of a possible JavaScript project’s Quality Commitments Matrix:

[Image: Quality Commitments Matrix]
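In case the original image doesn’t render here, a matrix along those lines might look like the following sketch. Every tool and value (Jest, ESLint, Cucumber, k6) is an assumption chosen for illustration, not a prescription.

```typescript
// A hypothetical Quality Commitments Matrix for a JavaScript project,
// expressed as plain data. All tools and values are illustrative assumptions.
const matrix = [
  { type: "xUnit (Jest)",     phase: "development", inDoD: true,  onCI: true,  regressions: true  },
  { type: "Linting (ESLint)", phase: "development", inDoD: true,  onCI: true,  regressions: false },
  { type: "Code coverage",    phase: "development", inDoD: true,  onCI: true,  regressions: false },
  { type: "BDD (Cucumber)",   phase: "uat",         inDoD: false, onCI: true,  regressions: true  },
  { type: "Performance (k6)", phase: "uat",         inDoD: false, onCI: false, regressions: false },
];

// Render the matrix as a simple text table -- the kind of "simple
// visual documentation" described above.
const header = "Type                | Phase       | In DoD | On CI | Regressions";
const rows = matrix.map(
  (c) =>
    `${c.type.padEnd(19)} | ${c.phase.padEnd(11)} | ${String(c.inDoD).padEnd(6)} | ${String(c.onCI).padEnd(5)} | ${c.regressions}`
);
console.log([header, ...rows].join("\n"));
```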

Eventually, I could see this matrix becoming something dynamic that CI servers update after tests run on a build.
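One way to sketch that idea: a CI step that takes a test run’s results and stamps the matching matrix row with its latest status. The shapes of `MatrixRow` and `TestRunResult`, and the match-by-type rule, are my assumptions, not part of any existing tool.

```typescript
// Hypothetical sketch: after a CI build, merge the latest test results
// back into the matrix so it stays a living document.
interface MatrixRow {
  type: string;           // e.g. "xUnit", "linting"
  lastRunPassed?: boolean;
  lastRunAt?: string;     // ISO timestamp of the most recent CI run
}

interface TestRunResult {
  type: string;
  passed: boolean;
  finishedAt: string;
}

function updateMatrix(rows: MatrixRow[], results: TestRunResult[]): MatrixRow[] {
  return rows.map((row) => {
    const result = results.find((r) => r.type === row.type);
    // Rows with no matching result (e.g. manual testing) are left untouched.
    return result
      ? { ...row, lastRunPassed: result.passed, lastRunAt: result.finishedAt }
      : row;
  });
}
```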

The Role of QA

The most significant shift in this approach is the role of QA. On a traditional team, quality is someone else’s responsibility. There are QA engineers who write test scripts and perform hours of manual regression tests. The problem is that many software systems are simply too complex for even a million testers to validate that they are performing correctly.

Rather than eliminating the role of QA, we recommend that the role becomes a more holistic part of a software team. There is simply no replacement for an expert exploratory tester attempting to push the natural boundaries of a system. They become one of the most critical lines of defense against system issues.

A Learning Organization

One of the critical values of an Agile approach to quality is that any issues encountered by the team are used to improve the quality of their product.

A key indicator for a healthy relationship with quality in a team is how they treat defects reported for their system. If people are blamed for issues, and their careers are negatively affected by them, a team will do almost anything to cover up issues or attempt to pass the buck to someone else. The news is filled with examples of just how costly such a blame culture can be to a company’s bottom line.

The healthy, Agile approach recommended here is to treat every issue reported to a team as a chance for the team to learn from it and make their commitments to quality stronger.

The algorithm is simple. If a defect is encountered, it can mean one of two things:

  1. The team was not keeping its quality commitments.
  2. There is a gap in their quality commitments.

If it is possible to point to a place on their matrix that should have caught the issue but didn’t, then the team isn’t keeping their commitments. This is a chance for the team to learn from their mistakes. Are they coding too fast? Are they being sloppy in code reviews? In the end, the defect should be documented along with the lessons learned.

However, if the team kept all of their commitments and the issue still got through, then the team needs to either adjust their commitments or document why they feel changing their approach is not in the team’s best interest. Perhaps the defect isn’t significant enough, or the work required to catch such issues in the future isn’t worth it or would significantly affect their timeline.

The important thing is that the team learns, and they can share these lessons.
