I was, once upon a time, a huge proponent of Agile. I thought the manifesto made a lot of sense, and it fit with my observation that a small group of skilled, dedicated programmers, working closely with the client, can almost always produce software more efficiently than a large consulting firm. Yet, over time, I've also seen Agile applied at companies with dozens or hundreds of programmers, and if anything the results were worse than that hokey old religion of Waterfall.
Now, proponents may argue that I just haven't seen enough such projects, but I have been on projects that ranged from just myself, to a team of about a dozen people, to a team of several hundred developers. What I find is what my original intuitions suggested - that Agile works great when you have "a small team of skilled, dedicated ..." you get the idea, but that after perhaps a dozen people, Agile begins to look suspiciously like Waterfall with Scrum. You have the stories, the fun Fibonacci series apps, the hockey stick that gets passed around.
What you don't get is good software. There are several reasons for this.
An account manager for a consulting company has a major challenge when putting together a team. Some consultants are simply better than others - they have more experience, they are more talented - and as such they are the ones who usually end up getting seen earliest by prospective clients. However, because they are better, they are also in demand on more projects, so you can't necessarily ink them in as being available for every project.
Instead, most account managers concentrate on roles. You need a back-end developer, a front-end developer, a systems architect, a UX person, a business analyst, a data modeler, a couple of DevOps people ... and before you know it, you're suddenly talking about a twelve-person team to do a project that could (and likely should) be done by three or four. If the people assigned to those roles are not as skilled, this can snowball, especially if they are doing incremental development without clear coordination.
Typically these people also start on day one, even before anyone has set out to figure out what exactly is being built. This creates sunk cost churn: a great deal of billed development that is essentially prototype-building, without anyone having a clear idea about data structures or API conventions. On paper, there's a lot of activity, but the project is essentially stalled in neutral with the gas pedal pushed all the way down.
Of course, they are still meeting every morning, still going through the litany of what they did yesterday and what they expect to do today, still passing around the hockey stick and playing with the Fibonacci cards, but somehow not much is actually happening - though now it is being done agilely.
Now, the account manager is happy, at least at first, because they are billing everyone in the shop, including the newbies, at decent rates. The program manager is happy, because he now has dozens of people who are working for him, and because, according to the Gantt charts and the Jira dashboards, everyone is making great progress.
For starters, with small teams, one person may perform several roles simultaneously. A full stack developer can work at any point in the stack, from the user interface to the deep bowels of database design and development. Get two of these programmers together, and you have enough to ensure that there's very little blocking; rather than sitting idle for most of a week, a developer can go on and do something else. Questions of interfaces can usually be resolved in a chat session, and one person can test while the other is coding. What's more, agreeing on this distribution of work is trivial. "Hey, Bob, I'm going to be working on the view modules today, stay out of those." "Sure thing, Alice."
Agile works great here, if the project itself is really self-contained. On the other hand, let's say that you need to work out a services protocol with other groups. If that protocol already exists and is well documented, great. If it doesn't, then dependencies creep in, along with larger questions about how well the project needs to interoperate with other projects. That's when software becomes political. It should not take a long time to develop a standard, but it always does, because every stakeholder ends up with a say - usually trying to reduce the amount of work that they would end up having to do.
If you are a manager, there is one aspect of agile development that should freak you out. At the end of a sprint, you are supposed to throw away the code you just wrote. That, in a nutshell, is what iterative development means. Your developer is writing a prototype, something to figure out how to solve a problem. Most program components are written three times - the first to get an idea how to solve a problem, the second to actually solve it, and the third to fix the problems introduced by the second. Some programmers may claim to be able to do all three in the same step, but if you watch them work, you'll discover that they just do this iterative process more efficiently, or tend to be more reliant upon frameworks where someone else has done this iterative process for them. This is called integration.
Most non-technical managers see programming as a mechanical process. You write code, it's done, you go onto the next module. In my experience, writing software is much more akin to writing a book. You write a chapter, you write a second chapter, you rewrite parts of the first chapter to sync with something you introduced in the second, you start on the third chapter, you throw that chapter out because it's not going where you want it to go, then you go back to the second chapter, move a part to the first chapter and another part to the as yet unwritten fourth chapter. The code is constantly evolving. In many respects that's why you go from prototype to prototype to final product.
Yet this introduces two problems - knowing when you're done, and scope creep. These are anathema to account managers, because they represent points where profits turn to losses. When the issue involves a couple of programmers, they can be strong-armed into wrapping up the current iteration and presenting it as done, even though they know it's not. When it is dozens of programmers, there are usually far too many inter-dependencies at work to do the same. This is a key point with Agile - you should always have a more or less working version of the application, for some definition of working, but again, as you scale, the likelihood that that definition of "working" is even in the same ballpark as the client's is small to non-existent.
Scope creep is the other side of this, and is typically less the fault of developers going crazy adding new features than of under-specifying the functionality of the software being written, because everyone jumped into code development too early. Because there will be things that are needed but not properly understood (data design being a glaring example), what is seen as necessary by the developer may be seen as superfluous by the account manager. Later in the process, when developer and account manager are at loggerheads about why some critical piece is not done on time, that is very frequently the reason.
Iteration is a necessary part of software development, but too often management balks at it, feeling that it unduly adds to the cost of software that should have been written correctly the first time. If software development were a truly mechanical process, that would be true, but as it is all too frequently a process of discovery, a commitment to real iteration at the outset can reduce show-stopping bugs near the end of the cycle.
There is a concept that plays a major factor in why bigger is not always better, and it has to do with communication impedance. The ideal size of a team is about seven. I have a few theories about why this is the case. Partially, it represents about the minimum number of people needed to solve 84% of the problems encountered (everything up to one standard deviation above the mean). It will also represent a group whose members fall within one standard deviation either way of the group's mean effective intelligence.
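As a rough sanity check on that 84% figure: it matches the cumulative normal distribution evaluated at one standard deviation above the mean, Φ(1) ≈ 0.8413. The tie to team size is the author's theory; the arithmetic itself can be verified with Python's standard library:

```python
from statistics import NormalDist

# Probability that a normally distributed quantity falls at or below
# one standard deviation above the mean: Phi(1).
p = NormalDist(mu=0, sigma=1).cdf(1)

print(f"{p:.4f}")  # ≈ 0.8413, i.e. roughly the 84% cited above
```

By contrast, "within one standard deviation either way" of the mean covers Φ(1) − Φ(−1) ≈ 68%, which is the relevant figure for the second sentence above.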
Effective intelligence is shared jargon and familiarity with a common set of tools. The person acting as leader of that team will usually have lower effective intelligence (EI) than many of the members, because they have to communicate with other people who overall have less awareness of that jargon (this is not reflective of overall intelligence, only of specialization). This also accounts for less experienced people in a group, who likewise lower overall EI.
Every link outside of this group between specialists will thus be "stepped down" or impeded to some extent. The more that you specialize groups by function (which almost always happens, and is almost always a bad mistake) the higher the degree of impedance - the more likely that concepts are misapplied, that potential errors are not caught, that there is a less accurate perception about the true state of a project. This will be just as true for an "Agile" shop as it will a traditional waterfall shop with the same number of people, especially where iterative development is one of the desired methodologies.
Perhaps the best solution there is the least intuitive - rotating team members from one team to the next, so that ideas that developed in one group can be disseminated to others. This reduces siloization, raises the average effective intelligence of the organization even if it is somewhat disruptive of the intelligence of any given group, and gives rise to a larger number of potential viewpoints. It is also, of course, almost never done.
I'm going to say something that many Agile "gurus" may consider heretical. Any large piece of software will take exactly the same time to accomplish via Agile as it would via Waterfall. Any time benefits that arise from Agile are almost always directly attributable to a better software environment and a better mix of people, not to the methodology itself.
Now, I think that the quality of software under Agile may, in many cases, be better, which makes it a worthwhile goal, but I've heard too many people tout the time to completion benefits as a big reason for Agile in the first place, and this simply is not so.
Agile works on the idea that you "come up for air" more often, typically every two weeks, and use that time to course correct. This is usually combined with a "sprint of sprints," typically at the end of every six sprints - a broader checkpoint where all of the sprint teams get together, show where they are, and determine whether the project is on track.
The problem with this is that a great deal of programming involves non-visual processes: data modeling, services deployment, user interfaces that need dummy data that hasn't yet been created, and so forth. A focus on demoing tends to emerge in any Agile shop, and the danger I've seen is that it both detracts significantly from the real work at hand and gives a very misleading picture of how far along the project really is. This is especially true for data-driven applications, as the visually interesting pieces often can only be shown fairly late in the cycle.
A second danger with short sprints is that they make for very short-term management, especially when goals are not firmly established ahead of time. This leads to a situation where the expectation is set that a given set of components should be done in a single sprint. It becomes even more dangerous when a manager wants to be "aggressive" in their schedules to prove they can post better numbers than other managers, even if it comes at the cost of developer fatigue, long hours, burn-out and ultimately attrition.
What makes this even more pernicious is that as components move from stand-alone testable modules to integration, deadlocks and blockages become the norm. This almost always results in management throwing more people into the "slow" group, which actually serves to lower the effective intelligence of that group (because the new hires need to be "brought up to speed" and are often hired based upon cost rather than skill). In a true Agile environment, this would prompt a reassessment of the complexity of the problem, but typically by this point, Agile has given way to the panicked-manager methodology of code development, which never ends well.
The solution might be to increase the length of sprints, perhaps to four weeks, to automate a demonstration framework, and to identify more clearly at the outset what the expectations for those demos are. Longer sprints also leave room for non-development time - decision time by managers, non-productive meetings, vacations and special events, and contingencies in case of blockages. Setting the expectation up front that Agile will not get your code to market any faster would also help, but I doubt seriously that this particular myth will be dispelled any time soon.
While I've covered a lot of ground here, I think there is one more myth that needs to be addressed. This is the assertion in Agile that the customer should in fact be a key stakeholder in the development process. As a consultant, I have worked with customers and clients for more than three decades, and have known only a few that I can point to that were "ideal" Agile customers.
First, I would question whether Agile is in fact the best methodology for a consultancy in the first place, beyond its importance in marketing material. The few consultancies that really are dedicated to solving the customer's problem quickly and efficiently are rare, because such consultancies typically have trouble staying in business - they have to charge a premium to make a profit, and if they do not have top-notch talent, they will be competing against less expensive consultancies that are looking at putting butts in seats for the long term, even if the end goal is simply surviving until they're fired.
That issue aside, the effective intelligence conundrum comes up in spades for business people. This is not to say that business people are stupid, but typically, the effective intelligence for communication among developers will be much higher than it will be between the developers and the associated business people. If it weren't, those same business people would solve the problems themselves and not hire the developers in the first place. This means that the best business stakeholders for Agile will be those who are most familiar with the terminology, technology and methodologies, yet have the authority to make the spending and hiring decisions necessary to support that development. This is a comparatively rare role, and one that should be a prerequisite to have in place before any large-scale project is undertaken.
In cases where you do have outside consultants performing in an Agile role, companies should also "embed" their own developers in the software process. Those developers know the company systems and protocols, and will need to become reasonably conversant with the new project in order to maintain it. Remember that creating training materials takes time (and unless you have someone specifically dedicated to that role, cannot really even start until a project is near finalization). More than a few projects have failed because, even though the project achieved its goals, nobody knew how to use it.
Arguing about whether Agile is dead (or even ineffective) misses a significant point. There are benefits to using Agile tools and methodologies when they are appropriate, but no methodology should be seen as anything except a template to more effectively produce software for a given purpose.
Additionally, the software that is created today is not the software that was created even ten years ago, so we should not expect methodologies to stay consistent. However, and this is crucial, it should be recognized that software does not exist in isolation, and that now more than ever, the primary aspects of software development are largely dependent upon non-coding activity - management approval, competing software projects, regulatory issues, standards establishment, security hardening and testing.
As software becomes easier (and hence faster) to develop, it will be these human activities that will actually set the tempo for any projects, and as such any thinking about methodology should recognize that software development time is not the primary factor in getting projects done.
Kurt is the founder and CEO of Semantical, LLC, a consulting company focusing on enterprise data hubs, metadata management, semantics, and NoSQL systems. He has developed large scale information and data governance strategies for Fortune 500 companies in the health care/insurance sector, media and entertainment, publishing, financial services and logistics arenas, as well as for government agencies in the defense and insurance sector (including the Affordable Care Act). Kurt holds a Bachelor of Science in Physics from the University of Illinois at Urbana–Champaign.