by Ardiana Spahija


As an agile practice, Test-Driven Development (TDD) is still waiting for its big breakthrough. TDD evangelist Dave Nicolette explains why he uses it as his own default approach.

Although the tools have been around for fifteen years now, TDD has never become the normal way of building code across the industry. Rather, it has become a sort of ”Think Different” movement, embraced by the leaders in the advancement of software development practices. One of the standard bearers is Dave Nicolette, an independent software and method consultant from Dallas and a popular Lean and Agile blogger.

Dave Nicolette is an independent consultant and popular agile blogger from the Dallas/Fort Worth area who is “bringing together the best of leading-edge methods to help businesses realize the maximum return on their IT investments.”

TDD may have become something of a buzzword in trade blogs and magazines, but Dave Nicolette has observed only a small increase in actual interest and acceptance among developers in the software industry over the past ten years. Most developers have heard of it but have never tried it, and most of those who have tried it have never made it their default method of building code. In Nicolette’s eyes, the majority of people who earn a living by writing software seem content to use techniques which, as he puts it, ‘create unnecessary extra work, misalignment with customer needs, defects, needless complexity, and delayed delivery’.

A new approach is needed

”There appears to be a general lack of awareness that the situation could be better,” Dave Nicolette says. ”TDD is not the only way to produce good results, but many developers evidently do not use any techniques – neither TDD nor anything else – to ensure their solutions are clean, accurate, and robust.”

For this to change, Dave Nicolette thinks that universities must teach students to build software in a new way. Nicolette is positive about the school where his son studies programming: students there are expected to make the provided tests pass as well as to extend the test suite to cover all necessary cases. They are also encouraged to collaborate rather than compete.

”I would like to see this approach become normal in schools, as it prepares students to do real work properly after graduation. Perhaps one day we will no longer have to re-train every university graduate before he or she can become effective on the job; or perhaps I am an optimist,” Nicolette says.

Several benefits

According to Nicolette, the value proposition of TDD is the same as it has always been: it helps us produce software that aligns well with customer needs, has relatively few defects, is understandable and maintainable, and is no more complicated than the problem requires. TDD also enables emergent design, which helps programmers avoid over-engineering or “gold-plating”, and it gives teams the flexibility to embrace change and meet challenging time-to-market requirements.

”Code that lends itself to automated testing tends to be cohesive and loosely coupled. Benefits of this include ease of code isolation, separation of concerns, extensibility and maintainability. These characteristics lead to solutions that have longer production lifetimes and lower cost of ownership than solutions developed using traditional methods. We also avoid the opportunity cost of extended debugging and ‘hardening’ of solutions, in that development teams can turn their attention to a new project more quickly than is possible with traditional methods.”

An automated test suite may serve as an early warning system for regressions and a safety net for modifying the production code—including both functional changes and structural changes (refactorings). Without it, there is a high probability that any modification will result in new defects that cannot be detected in a reliable way before the code goes live.
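The safety net described above grows out of TDD's basic red-green-refactor rhythm: write a failing test, write the simplest code that makes it pass, then restructure under the suite's protection. A minimal sketch in Python (the slugify function and its tests are invented for illustration; they are not from the article):

```python
# Step 1 (red): describe the desired behavior in a test before any
# production code exists; at this point the test fails.
def test_slugify():
    assert slugify("Hello World") == "hello-world"
    assert slugify("  Agile  ") == "agile"

# Step 2 (green): the simplest implementation that makes the test pass.
def slugify(title):
    return title.strip().lower().replace(" ", "-")

# Step 3 (refactor): with the test green, slugify's internals can be
# restructured freely; any regression is caught before the code ships.
test_slugify()
```

Re-running the suite after every small change is what turns these tests into an early warning system for regressions, for functional and structural modifications alike.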

Seeing what the system actually does

Because well-defined automated test suites unambiguously and completely describe the behavior of a system, TDD reduces the quantity of technical documentation that is usually necessary to support a solution. High-level architectural diagrams and a context diagram are often the only documentation artifacts needed.

”This has benefits for production support personnel as well as for new team members who need to learn an unfamiliar code base. They can easily see what the system actually does in its current version, rather than depending on an abstract description of what the system was intended to do as of the time it was first designed,” Nicolette says, and continues:

”On a personal level, the value proposition is simple: TDD makes the developer’s life easier. We chose a career in software development because we enjoy solving problems and creating useful, elegant software. Using TDD, we are able to spend most of our time doing those things. Using traditional methods, we spend most of our time debugging. Most developers find debugging to be less interesting than creating or improving systems. I am too lazy to spend time debugging problems that I can easily avoid creating in the first place. I am too lazy not to use TDD.”

”It is always risky to predict the future, but I think it may be safe to say that people will continue to advance these ideas, as they have done since the 1960s. I look forward to the day when someone invents a better way to build software than TDD. Until that day, I will continue to use the practice as my own default approach.”

TDD – a fifty-year-old concept

The basic concepts of TDD have been around for almost five decades. A good illustration is the proceedings of the first NATO conference on software engineering in 1968, where the famous computer scientist Alan Perlis wrote that software is best developed by interleaving small amounts of testing with small amounts of design (coding). At the same conference, people were already talking about the importance of automated software testing. The software delivery process known as the V-Model was created at Hughes Aircraft in 1982 and is based on ideas we would now call specification by example and TDD.

”I am puzzled at how quickly the software industry forgot these early lessons,” Dave Nicolette says. ”I am also surprised at how slow we have been to rediscover them.”


Three impediments facing teams adopting TDD
– and how to resolve them

1. Difficulty of changing established habits
Technical professionals may understand the value of TDD on an intellectual level, but when they sit down to work they tend to do things in a familiar way. It is only natural for us to depend on familiar methods when there is pressure to deliver.

2. Poorly factored legacy code
Most teams are working with an existing code base that was not originally developed using TDD or any other techniques that would have resulted in well-factored code. When TDD is first introduced, it is usually presented in the context of greenfield development. Faced with a monolithic legacy code base with significant design debt, developers have difficulty applying the TDD approach. Another set of skills is needed: the ability to break up monolithic code into testable pieces.

3. Large-solution thinking
Many people are not accustomed to reducing large solutions to small pieces. They can conceive of a solution only in terms of its final form, and they can only conceive of a test case that exercises the complete application end-to-end, with all external interfaces live. When developers decompose their designs, they tend to think about architectural layers or functional components rather than vertical slices of system behavior. Novices in TDD often try to describe, test-drive, and build large chunks of code. It takes time and effort for them to learn to decompose designs in a way that fits well with the TDD approach.
