Test-Driven Development and Specification by Example
Change-resilient alternatives to phased planning approaches
The universal tendency to plan in phases is problematic because it is fragile in the face of change. On the other hand, “responding to change over following a plan” is one of the Agile values, so it’s no surprise that we have many models and approaches that help product teams iterate on what to build.
How this iteration happens from an implementation perspective is often less clear. What cyclical workflows does a team adopt that allow it to iterate continuously on not just what to build, but also on how to build it?
“What” and “how” are interwoven
First off, care should be taken when thinking about “what” (the business requirements) and the “how” (the implementation). If we think of them as separate concepts, then we ultimately invite what we are trying to avoid — phases in software development.
Consequently, product teams interweave the “what” and the “how” of software development by adopting Test-First Development1 methodologies. Writing tests prior to implementation continues to be one of the most misunderstood practices in software development. I have seen it referred to as a “testing strategy”, a “luxury”, or only applicable “when stakes are high”.
Given the cost of software development today (including the opportunity cost brought on by talent scarcity), when are the stakes low? For whom are tests considered a “luxury”? For the software engineers, so that they can do their job effectively and sustainably? For the team, so that it can safely pivot in response to rapidly-shifting requirements in the product? What cool-headed professional would not want to “indulge” such a “luxury”?
Perhaps most grievous of all is the misconception that Test-First Development is a “testing strategy”. In the sample phased software development model introduced in a prior post, Business Analysis happens three levels upstream from QA. In a sense, Test-First Development merges the four phases — Business Analysis, Architecture Design, Implementation, and QA — into one. Granted, testing (QA) is there, but so are the three other phases. To call Test-First Development a testing strategy is like calling a house a heating strategy. Yes, a house will provide you with heat, but also shelter, safety, comfort, etc.
What is Test-First Development?
In XP Explained, Kent Beck defines Test-First Programming simply as:
… write a failing automated test before changing (or adding) any code.
Further, Test-First Development and Test-Driven Development (TDD) are often thought of as one and the same. In actuality, the latter is a subset of the former. A straightforward way to determine whether or not TDD is being practiced is the length of the feedback loop. For TDD, this should be on the order of minutes, while other Test-First Development practices can have a feedback loop on the order of days to weeks.
Concretely, TDD leverages the red-green-refactor cycle of software development:
Red: Write a failing unit test2
Green: Write the implementation to make the failing unit test pass
Refactor: while keeping the tests green,
Improve the implementation (without changing the tests)
Improve the tests (without changing the implementation)
Again, it’s important to emphasize this point — a single cycle should happen on the order of minutes. If it takes longer, then it’s not TDD.
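As a concrete sketch of a single red-green-refactor cycle, consider a hypothetical shopping-cart total (the `Cart` class and its methods are illustrative, not taken from any project discussed above):

```python
# RED: this test is written first. At that point it fails,
# because Cart does not exist yet.
def test_total_sums_item_prices():
    cart = Cart()
    cart.add("book", 12.50)
    cart.add("pen", 1.25)
    assert cart.total() == 13.75

# GREEN: the minimal implementation that makes the test pass.
class Cart:
    def __init__(self):
        self._items = []

    def add(self, name, price):
        self._items.append((name, price))

    def total(self):
        # REFACTOR: an initial version might have accumulated the
        # total in a manual loop; with the test still green, it is
        # simplified to a sum() over the item prices.
        return sum(price for _, price in self._items)

test_total_sums_item_prices()  # the cycle ends with a green test
```

The whole cycle — writing the failing test, making it pass, and tidying up the implementation — fits comfortably in a few minutes, which is exactly the feedback-loop length that distinguishes TDD from other Test-First practices.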
When TDD is not enough
Oftentimes, a business requirement cannot be satisfied with TDD alone. While on the one hand a bug can often be addressed by filling in an edge case via a few red-green-refactor cycles, a user story typically involves elaborate end-user interaction. It is impractical, for example, to write a unit test for a user navigating to a website, viewing an item, and then adding that item to their cart. Assuredly, however, the code generated by implementing that requirement would involve several TDD cycles and be supported by a number of unit tests.
For such a scenario, a complementary Test-First Development approach is needed. Specification by Example is one such approach:
Specification by example (SBE) is a collaborative approach to defining requirements and business-oriented functional tests for software products based on capturing and illustrating requirements using realistic examples instead of abstract statements. This approach is particularly successful for managing requirements and functional tests on large-scale projects of significant domain and organizational complexity.3
How Specification by Example prevents phases
The critical part of SBE is frequent collaboration between all four disciplines of software development, namely: QA, Product, Engineering, and Design. SBE encourages this collaboration to be continuous throughout the development process, thus preventing these disciplines from siloing themselves into phases. Here is how a specification written using SBE might look:
Given a potential customer visits the website, and
given that they are using a computer with a pointing device,
When they hover over a featured product,
Then they are shown a product preview pop-over.
Note how SBE exposes the need to consider specifics, such as the fact that hovering over a product only makes sense on a device with a pointing device (such as a laptop or desktop computer). This is one example of how SBE allows for feedback loops to be drastically shortened. In a phased approach, by contrast, this UX design oversight would only be caught in the QA phase, which is too late in the process to address it in a holistic way.
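Such a specification can become an executable acceptance test. The sketch below encodes the Given/When/Then steps above as comments in a plain test; the `Page` object is a hypothetical stand-in for a real UI driver (in practice this role would be played by a browser-automation page object, and the steps might be wired up with a BDD framework):

```python
# Illustrative stand-in for a page object; not a real UI library.
class Page:
    def __init__(self, has_pointing_device):
        self.has_pointing_device = has_pointing_device
        self.visible_popover = None

    def hover_featured_product(self, product):
        # A hover event only exists on devices with a pointing
        # device; on touch devices the specification would need to
        # define a different trigger.
        if self.has_pointing_device:
            self.visible_popover = f"preview:{product}"

def test_hover_shows_product_preview():
    # Given a potential customer using a device with a pointing device
    page = Page(has_pointing_device=True)
    # When they hover over a featured product
    page.hover_featured_product("espresso-machine")
    # Then they are shown a product preview pop-over
    assert page.visible_popover == "preview:espresso-machine"

test_hover_shows_product_preview()
```

Because the test mirrors the specification line by line, all four disciplines can read it, question it, and refine it together — which is the point of SBE.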
SBE aside, if your cross-disciplinary product team writes the actual specifications together — be it virtual or in-person, synchronous or async, using SBE or otherwise — you’ve already made great headway towards eliminating unnecessary phases and thus adopting an approach that is not just resilient to but also optimized for change. In other words, a team that continuously creates and refines specifications together in short iterations is poised to leverage change and transform it into an opportunity, as opposed to dealing with it as a threat.
I continue to be amazed by teams claiming to be Agile on the one hand, while on the other hand the software engineers on these teams gladly accept handed-down, “dev-ready” specifications. Furthermore, when a specification is incomplete, it’s often the software engineers who complain that the process should be improved by adding more implementation details before they even look at a ticket. I’ve witnessed that the attraction towards phased approaches doesn’t just come from management or from product — it often comes from the software engineers themselves.
How does your team implement requirements? Can software engineers provide feedback on the specification, and if so, how does that feedback happen? I look forward to hearing your thoughts and experiences.
Also known as Behaviour-Driven Development (or BDD)