The Agile Developer's Handbook

By: Paul Flewelling

Incremental – adaptive versus waterfall – predictive

The Agile Manifesto advocates incremental delivery using adaptive planning. In this section, we compare and contrast this approach with the more traditional approach of Waterfall delivery and predictive planning.

We'll look at both approaches and some of the ways they shape how we deliver software. Before we launch into the detail, here's the quick comparison: Waterfall fixes scope up front and follows a single, predictive plan through sequential phases, delivering working software only at the very end; Agile delivers small increments of working software early and often, adapting the plan as new information emerges.

The Waterfall process and predictive planning

The traditional delivery model known as Waterfall was first shown diagrammatically by Dr Winston W. Royce when he captured what was happening in the industry in his paper, Managing the Development of Large Software Systems, Proceedings WesCon, IEEE CS Press, 1970.

In it, he describes a gated process that moves in a linear sequence. Each step, such as requirements gathering, analysis, or design, has to be completed before handover to the next.

Royce's paper presented it visually as a cascade of phases, each feeding into the next: system requirements, software requirements, analysis, program design, coding, testing, and operations.

The term Waterfall was coined because of this observation: just like a real waterfall, once you've moved downstream, it's much harder to return upstream. This approach is also known as a gated approach because each phase has to be signed off before you can move on to the next.

He further observed in his paper that, to de-risk this approach, there should be more than one pass through the process, each iteration improving and building on what was learned in the previous one. In this way, you could deal with complexity and uncertainty.

For some reason, though, not many people in the industry got the memo. They continued to work in a gated way but, rather than making multiple passes, expected the project to be complete in just one cycle or iteration.

To control the project, a highly detailed plan would be created, which was used to predict when the various features would be delivered. The predictive nature of this plan was based entirely on the detailed estimates that were drawn up during the planning phase.

This led to multiple points of potential failure within the process, usually with little time built into the schedule to recover. It felt almost de rigueur that, at the end of the project, some form of risk assessment would take place before the decision was finally made to launch with incomplete and inadequate features, often leaving everyone involved stressed and disappointed.

The Waterfall process is a throwback to when software was built more like the way we'd engineer something physical, such as a bridge or a building. It's also been nicknamed faith-driven development because it doesn't deliver anything until the very end of the project. Its risk profile therefore stays high for almost the entire project, dropping only at final delivery.

No wonder all those business folks were nervous. Often their only involvement was at the beginning of the Software Development Life Cycle (SDLC), during the requirements phase, and then right at the end, during the delivery phase. Talk about a big reveal.

The key point in understanding a plan-driven approach is that scope is often nailed down at the beginning. Delivering to that fixed scope then requires precise estimates to determine the budget and resourcing.

The estimation needed for that level of precision is complicated and time-consuming to complete. This leads to more paperwork, more debate, in fact, more of everything. As the process gets bigger, it takes on its own gravity, attracting more things to it that also need to be processed.

The result is a large chunk of work with a very detailed plan of delivery. However, as already discussed, large chunks of work carry more uncertainty and more variability, calling into question our ability to give a precise estimate in the first place.
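
To make that concrete, here's a minimal Monte Carlo sketch. It's entirely illustrative: the task counts, the lognormal effort model, and the shared "discovery" factor are assumptions for this sketch, not anything from the book. Real task effort tends to be right-skewed, and unknowns discovered mid-project tend to affect every remaining task at once:

```python
import random
import statistics

def simulate_totals(n_tasks, estimate=5.0, task_sigma=0.5,
                    shared_sigma=0.3, trials=2000):
    """Monte Carlo the actual total effort for a project of n_tasks.

    Each task's actual effort is lognormally distributed around its
    estimate (right-skewed, like real task effort), and a single
    project-wide 'discovery' factor models unknowns that hit every
    remaining task at once."""
    totals = []
    for _ in range(trials):
        discovery = random.lognormvariate(0, shared_sigma)
        totals.append(sum(estimate * discovery *
                          random.lognormvariate(0, task_sigma)
                          for _ in range(n_tasks)))
    return totals

for n in (10, 50, 200):
    planned = n * 5.0
    totals = simulate_totals(n)
    p90 = sorted(totals)[int(0.9 * len(totals))]
    print(f"{n:>3} tasks: planned {planned:6.0f}d, "
          f"mean actual {statistics.mean(totals):6.0f}d, "
          f"90th-percentile slip {p90 - planned:5.0f}d")
```

The specific numbers don't matter; what matters is that the potential slip grows with the size of the chunk, which is exactly what undermines a precision estimate made up front.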

And because so much effort was put into developing the plan, an irrational attachment to it develops. Instead of deviating from the plan when new information is uncovered, the project manager tries to control the variance by minimizing or deferring it.

Over time, and depending on the size of the project, this can result in a substantial deviation between what the plan delivers and what the business actually needs by the time the software ships.

This led to much disappointment for people who had been waiting many months to receive their new software. The gap in functionality would often cause some serious soul-searching on whether the software could be released in its present state or whether it would need rework first.

No-one wants to waste money, so it was likely that the rollout would go ahead and a series of updates would follow that would hopefully fix the problems. This left the people using the software facing a sometimes unworkable process that would lead them to create a series of workarounds. Some of these would undoubtedly last for the lifetime of the software because they were deemed either too trivial or too difficult to fix.

Either way, a business implementing imperfect software that doesn't quite fit its process faces additional, often undocumented, costs as users try to work around the system.

For those of us who have tried building a large, complex project in a predictive, plan-driven way, there's little doubt it often fails to deliver excellent outcomes for our customers. The findings of the Standish Group's annual Chaos Report are a constant reminder: we're still better at delivering small software projects than large ones, and Waterfall or predictive approaches are more likely to result in a project being challenged or deemed a failure, regardless of size.

Incremental delivery and adaptive planning

Incremental delivery seeks to de-risk the approach by delivering small chunks of discrete value early and often, to get feedback and reduce uncertainty. This allows us to determine, sooner rather than later, whether we're building the right thing.

By delivering increments of ready-to-use, working software, we reduce risk significantly after only two or three iterations, as the hypothetical risk profile sketched below shows.
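
The shape of the two risk curves is easy to model. The following sketch is hypothetical: the ten-iteration length and the 40% retired-per-iteration figure are invented for illustration. Waterfall validates nothing until final delivery, while each incremental release turns assumptions into feedback and retires a share of the remaining risk:

```python
def risk_profile(iterations, retired_per_iteration):
    """Remaining delivery risk after each iteration, where feedback on
    each shipped increment retires a fixed fraction of what's left."""
    risk, profile = 1.0, []
    for _ in range(iterations):
        risk *= (1.0 - retired_per_iteration)
        profile.append(risk)
    return profile

# Waterfall: no working software until the end, so nothing is
# validated until the final step and risk stays near 100% throughout.
waterfall = [1.0] * 9 + [0.1]

# Incremental: every iteration ships working software and retires
# (say) 40% of the remaining risk through real feedback.
incremental = risk_profile(10, 0.4)

for i, (w, inc) in enumerate(zip(waterfall, incremental), start=1):
    print(f"iteration {i:>2}: waterfall {w:4.0%}   incremental {inc:4.0%}")
```

By iteration three, the incremental profile has already dropped below a quarter of the starting risk, which is the "significantly after only two or three iterations" claim in miniature.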

This is combined with an approach to planning that allows us to quickly pivot or change direction based on new information.

With an adaptive plan, the focus is on prioritizing and planning for a fixed horizon, for example, the next three months. We then seek to re-plan once further information has been gathered. This allows us to be more flexible and ultimately deliver something that our customer is much more likely to need.
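
Here's a minimal sketch of what that plan-deliver-replan loop looks like; the backlog items, their values, and the SSO feedback are all invented for illustration, not a prescribed algorithm from the book:

```python
from dataclasses import dataclass

@dataclass
class BacklogItem:
    name: str
    value: int  # current best guess at business value

def plan_horizon(backlog, capacity):
    """Plan only the fixed horizon: the highest-value items that fit."""
    return sorted(backlog, key=lambda item: item.value,
                  reverse=True)[:capacity]

def replan(backlog, delivered, feedback):
    """Drop what shipped, then re-score the rest with what we just
    learned -- this re-scoring is the 'adaptive' part of the plan."""
    remaining = [item for item in backlog if item not in delivered]
    for item in remaining:
        item.value = feedback.get(item.name, item.value)
    return remaining

backlog = [BacklogItem("reports", 8), BacklogItem("search", 5),
           BacklogItem("export", 3), BacklogItem("sso", 2)]

iteration = 1
while backlog:
    increment = plan_horizon(backlog, capacity=2)
    print(f"iteration {iteration}:", [item.name for item in increment])
    # Hypothetical feedback: once users see search, they ask for SSO.
    feedback = {"sso": 9} if iteration == 1 else {}
    backlog = replan(backlog, increment, feedback)
    iteration += 1
```

Notice that the plan for iteration two didn't exist until the feedback from iteration one arrived; that is the kind of cheap pivot an adaptive plan makes possible.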

Each iteration or increment in an adaptive planning approach provides an opportunity to correct course toward the actual business need.