I’ve seen my fair share of well-intentioned, well-researched big projects—big as in projects that run twelve months or more. All of them either did not finish in time or had significant cuts in scope.

Why is it impossible to plan and execute a project that runs over many months?

One of the “big projects” I witnessed was estimated to take 18 months. It has now been running for close to two years with a much bigger team than predicted, and the current estimate is six more months.

And this is a pattern. The Standish Group, which has a database of some 50,000 development projects, looked at the outcomes of multi-million dollar development projects and ran the numbers for Computerworld. Of 3,555 projects from 2003 to 2012 that had labour costs of at least $10 million, only 6.4% were successful[1].

In the literature about conventional project management, the general consensus is that scope growth happens because the planning wasn’t good enough. However, there is a limit to planning: despite best efforts, complex problems cannot be fully planned for.

Software projects are complex problems. They will have unknown factors that will only be discovered while the project is underway. Planning, no matter how sophisticated, will never be able to foresee these unknowns.

Usually a contingency factor is built into the estimate to cover these unknowns, but it is never enough.

Douglas Hofstadter’s Law [2] sums it up quite well:

“It always takes longer than you expect, even when you take into account Hofstadter’s Law.”

“…but why?”

At the Agile Business Analyst Masterclass, I learned that the industry number for scope growth is 2% per month [3].

I decided to run some numbers in a spreadsheet. Assume a project is estimated to run for 12 months with a scope of 100. Further assuming the assigned team who estimated it won’t change over the year, the chart would look something like this:

[Chart: scope of the work growing exponentially vs. the team’s linear output over the 12-month estimate]

If the team works at a perfect, constant velocity, it needs 17 months to catch up with the exponential curve of the scope growth. That’s more than 40% over budget.

If I run the same experiment for an estimated 18-month project, the team needs 39 months to finish the project.


If I run it with a project estimation of two years, the project will never finish: the scope grows faster than the team can deliver work.
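The three scenarios above can be reproduced with a short simulation. This is a minimal sketch in Python, assuming the numbers from the text: a starting scope of 100, 2% compound scope growth per month, and a team that delivers at a perfectly constant velocity derived from its original estimate.

```python
def months_to_finish(estimated_months, initial_scope=100.0,
                     monthly_growth=0.02, max_months=600):
    """Simulate monthly delivery against compounding scope growth.

    The team delivers initial_scope / estimated_months units of work
    per month (perfect velocity); the remaining scope compounds by
    monthly_growth each month. Returns the month in which delivered
    work catches up with the scope, or None if it never does within
    max_months.
    """
    velocity = initial_scope / estimated_months
    delivered = 0.0
    scope = initial_scope
    for month in range(1, max_months + 1):
        delivered += velocity           # linear progress
        scope *= 1 + monthly_growth     # exponential scope growth
        if delivered >= scope:
            return month
    return None  # work delivered never catches up with the scope

print(months_to_finish(12))  # 17 - a 12-month estimate takes 17 months
print(months_to_finish(18))  # 39 - an 18-month estimate takes 39 months
print(months_to_finish(24))  # None - a 24-month estimate never finishes
```

The intuition: a linear function can only overtake an exponential one early on, while the exponential curve is still flat. Stretch the estimate far enough and the 2% compounding outruns the team’s fixed monthly output before it ever catches up.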


“…but in a real life situation, something would be done about this!”

Indeed, in reality a product manager would not wait until the actual release date. A conventional waterfall environment also has feedback loops and potentially a steering committee to keep an eye on scope and costs. Unfortunately, that is when Tom Cargill’s ninety-ninety rule [4] comes into effect:

“The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time.”

This is even more true in a non-test-driven development environment without sprints. Close to the delivery date, the development team will report that they are right on track, until all of a sudden, progress grinds to a halt.

It is too late to abandon the project. After all, according to the reports 90% of the work is already done. So extra budget is assigned, the team size grows, and pressure on everyone involved increases.

The outcome is lots of overtime, sinking motivation and, as a direct result, a lot more bugs in the code — all of which add to the development time.

This is not what was intended, but unfortunately, it is the reality. In the project I mentioned earlier, the team is now working seven days a week with little increase in progress, and people have resigned as a direct result.

“What now?”

The impossibility of accurately predicting the solution to a complex problem, combined with the human tendency to attempt exactly that, is what leads us down this path.

A lot has been written about how to avoid this, but I don’t think it is possible. We are all human, and despite knowing that, we are fooled by our natural optimism and keep repeating the same patterns.

Yet, I did learn how to navigate around this:

  • Have a plan for the big picture but don’t spend time scoping and estimating beyond the first months.
  • Plan to release in less than six months and expect that it will take longer than planned. Short release cycles help mitigate the effects of the ninety-ninety rule and Hofstadter’s Law.
  • Accept that writing software is a complex task that can never be estimated accurately, so don’t waste time, money, and people’s nerves trying.

“Release early, release often (And listen to your customers.)” – this is nothing new. Eric Raymond wrote this in 1997 and it still holds true. If a product has no plans to put anything in front of users after four to six months, something is probably wrong.

This is a big part of why I believe the Agile methods are by far the best way to develop software and despite recent controversies, I do believe Agile is here to stay.


  1. http://www.computerworld.com/article/2486426/healthcare-it/healthcare-gov-website–didn-t-have-a-chance-in-hell-.html
  2. https://en.wikipedia.org/wiki/Hofstadter%27s_law
  3. http://www.betterprojects.net/2011/05/what-does-2-percent-scope-growth-per.html
  4. https://en.wikipedia.org/wiki/Ninety-ninety_rule#cite_note-Bentley1985-1

This post has been republished with permission from Tobias Moos, Business Analyst at DiUS. See his original post from his blog Tolog here: http://tolog.tumblr.com/post/139454355277/on-why-big-projects-fail