Let’s go straight to the point: in the IT industry we made a major mistake about 60 years ago.
An understandable mistake, but one that has nevertheless cost us a great amount of money and energy, something that we are still paying dearly for.
That mistake was to think that we could build software the same way we had built everything else before, especially mass-produced tangible goods.
That was a bad idea, simply because it doesn’t work; or, at the very least, it’s a highly inefficient way of building software.
The good news, of course, is that we are now aware of this – and the Agile approach is one clear and solid proof of that awareness, along with other new developments in the software industry.
However, let’s start from the beginning.
An Easy Mistake To Make
As a species, we learned to build stuff.
And what we learned about building stuff we pass on from generation to generation, in a process of continuous education.
Let's play a little game for a second: let's pretend that everything mankind has learned about building stuff could be taught in a one-day course, from 9 am to 5 pm.
Next, let's define a couple of reference points in our training program: let's arbitrarily say that we have been seriously building stuff since the pyramids, about 3000 years ago; and let's say that we have been seriously building software for the last 60 years, roughly.
If we set the building of the pyramids at the beginning of the course (9 am) and the present day at its end (5 pm), a little math shows that, in an entire day of training, you were taught about building software only from about 4:50 pm to 5:00 pm (60 years on a scale of 3,000 years comes to roughly ten minutes of an eight-hour day) – and that's assuming there were no breaks in our course!
Everything else you learned during that day had very little to do with building software, or nothing at all.
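The scaling above can be sketched in a couple of lines of Python, taking the round figures assumed in the text (a 3,000-year history, 60 years of software, an eight-hour course with no breaks):

```python
# Scale 3,000 years of building experience onto an 8-hour course (9 am - 5 pm).
# All figures are the rough ones assumed in the text, not historical data.
course_minutes = 8 * 60      # 480 minutes from 9 am to 5 pm
total_years = 3000           # "seriously building stuff" since the pyramids
software_years = 60          # years of serious software development

software_minutes = course_minutes * software_years / total_years
print(software_minutes)      # -> 9.6, i.e. roughly the last ten minutes of the day
```

In other words, software occupies only the last sliver of the course, whatever exact round numbers you plug in.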
So it's not surprising that, when we started building software a few decades ago, we applied the same practices that we have used for centuries to build other things.
The problem, of course, is that we didn't realize that software was a completely new and different story, something that mankind had never created before.
I'm not talking about the technical components of software; what I'm saying is that never before in our history had there been an industry that built and sold, on a planetary scale, a totally intangible product.
How Software Is Different
Software is not the first intangible thing we ever produced: music is certainly another, and it dates back much further than software.
There are in fact similarities between music and software: music exists only in the instant when it’s played, software exists only in the instant when it’s run.
After that instant, both go back into the void whence they came; anything else is just a static representation in some arbitrary form – a music score or lines of code – that our minds need in order to manipulate that intangible product.
Another characteristic of software is that, mainly, it’s produced by the collaborative effort of a group of people.
Considering the complexity of most problems we create software to solve, that's reasonable: it would be almost impossible for a single person to develop a very complex application (one that solves a complex problem) quickly enough for it to provide actual business value.
In other words, if a single person spent five years developing a product, it's very likely that by then potential customers would have lost interest in it, at least in that form.
So we need teams that work together to develop something intangible for which there is only an arbitrary representation.
Finally, software development is intrinsically heuristic, that is, there is an inevitable amount of exploration and learning that comes with developing software.
After all, the main reason why someone would spend time, money and energy to develop new software is because there isn’t already a piece of software that does exactly the same thing: nobody ever wrote it.
So there is always a certain amount, big or small, of “unknownness” in developing software, which is why we need time to explore that unknown, learn about it and learn how to deal with it – and the amount of time we need is very hard to predict precisely.
For this reason, software development is not a deterministic activity, but an empirical one, at least in part.
So, the act of developing software is the collaborative effort of a group of people that heuristically develop a completely intangible product of the human mind.
And we’ve made a global industry out of that.
Now, don’t tell me that this isn’t quite new for us humans!
We Are Influential
At this point in the history of mankind, the software industry is one of the most influential activities on the planet, meaning that we touch and change, to a minor or major degree, the lives of billions of people.
There is software in pacemakers, just to give you an example of how much we may affect one person’s life.
To mention a few other examples (though you can easily extend this list yourself), software is used:
- to make airplanes fly, ships sail and cars run;
- to move huge amounts of money around the planet every day;
- to assist virtually all science researchers;
- to create movies, music and other art forms;
- to reconnect distant people;
- to give freedom of speech – and freedom to read – to everyone who has an Internet connection (in most countries).
And of course, there is software in many personal appliances including your washing machine, your TV set and your telephone.
This list is far from being exhaustive; it’s just to give you an idea of what I mean when I say that our industry is influential: we directly and indirectly affect the lives of billions of people – and, in truth, we try to make some areas of their lives easier.
We Are Inefficient
Interestingly enough, we are also one of the most inefficient industries on the planet, especially considering the great responsibility that comes with our enormous influence.
When making such a bold statement, one could mention the Standish Group’s Chaos Report to support it, although there is quite a lot of controversy about the figures in the report and, more specifically, about the analysis and data collection methods used to produce it.
If you have been in IT long enough, though, you don’t need figures to know how frequently a project may suffer from one or more of the following issues:
- the project runs over time (the product misses its delivery date);
- the project runs over budget;
- the product has serious functional defects, to the point that customers aren't keen to buy or use it (diminished business value);
- the product doesn't meet expectations, lacks some of the expected functionality or, in general, doesn't solve the problems users were hoping it would solve (diminished business value again).
Sometimes – too frequently, actually – these issues are so serious that the project gets cancelled before anything is delivered; and if you like figures, the Chaos Report states that about 30% of projects fail completely, while about 50% are challenged by the issues listed above.
Even ignoring the figures and relying only on my years of experience in IT, I can't think of another industry on the planet that is so ingrained in the lives of billions of people and, at the same time, scores so badly in terms of quality and performance.
And I say this with all the respect I have for the industry I’ve been working in for the last twenty-some years, and most of all with the respect I have for its people.
But They Need Us
In my opinion, the only reason we have gotten along all this time – using an inadequate production model, being so inefficient and delivering products so distant from what people needed – is that we happen to live in times when people on this planet crave software.
In other words, our industry survives because there is far more demand for software than we are able to supply and, regardless of how inefficient and sloppy we are, there will always be someone asking us to create new software.
Of course it’s the industry that survives, not the individual companies: with such a high project failure rate, some companies are forced to shut operations down or to lay off quite a few people and enter a slow-paced or fast-paced death march.
And now for the good news: the fact that people crave software means there is a lot of room, a lot of business opportunity, for IT companies that can be efficient and provide their customers with quality products in reasonable time and at an affordable price.
Basically, there is a lot of room for IT companies that will be able to stand out and move away from the old production model.
How We Got Here
As I said earlier, when we started building software a few decades ago, we didn’t know or realize that software was a completely different story, something that no industry had ever produced in the history of mankind.
So we started building it just like we used to build other things, such as mass-produced tangible goods.
Frederick Winslow Taylor defined his theory of scientific management at the end of the 19th century, when the Industrial Revolution required labor to operate the factories and many people left their occupations as craftsmen or farmers to go work on assembly lines.
As such, Taylorism is focused on optimizing labor and processes in a repetitive, deterministic and measurable workflow.
There would be a lot to say about Taylor's theory to do it justice, but for the sake of this discussion two aspects are of interest (from Wikipedia):
- the process should be enforced by management
"It is only through enforced standardization of methods, enforced adoption of the best implements and working conditions, and enforced cooperation that this faster work can be assured. And the duty of enforcing the adoption of standards and enforcing this cooperation rests with management alone."
- workers are incapable of understanding what they are doing
"'I can say, without the slightest hesitation,' Taylor told a congressional committee, 'that the science of handling pig-iron is so great that the man who is ... physically able to handle pig-iron and is sufficiently phlegmatic and stupid to choose this for his occupation is rarely able to comprehend the science of handling pig-iron.'"
In practice, Taylor was suggesting that all the thinking and planning should be done by well-educated managers, with the plan then passed down to uneducated workers for execution.
Since the produced goods are tangible and the process is algorithmic and deterministic, it's then possible to compare the output from workers (product) with the input from managers (plan).
Of course, Taylor realized that working many hours a day on a production line, performing a repetitive task without ever seeing the final result of your labor (you work on only a portion of the product), would give workers no intrinsic motivation to do their job well; his solution was to measure performance and provide monetary incentives or deterrents.
Also, at this point workers become interchangeable, which is why they are frequently referred to as "resources" rather than "people".
For an assembly line, the model envisioned by Taylor may be understandable and, in fact, it has been used in countless factories and companies for over a century.
It's been around for so long that it has become ingrained in our society and our very lifestyle, to the point that it's been imported into companies that don't even have an assembly line, as if it were the only way to produce anything (that's why the lines above may have rung a bell with you, even if you don't work on an assembly line).
Probably more unconsciously than not, this approach was also adopted by IT companies; but as we have seen, software development is not a deterministic, algorithmic, repetitive process to mass-produce tangible goods.
We who work in IT are knowledge workers who create intangible goods; and we do that empirically, not algorithmically, because of the very nature of the products we create; and we do it in teams, not in sequential assembly lines.
Aside from common sense, there is plenty of scientific evidence that shows how negatively knowledge workers are affected by a factory-like approach.
Here We Call It Waterfall
A very concrete and dramatic incarnation of a Taylor-like approach to software development is what ended up being called the "Waterfall model", still dominant in the IT industry.
Waterfall mirrors, for software, many of the principles of Taylorism: the process is sequential and assumed predictable, all design happens upfront, and implementation is ultimately delegated to labor workers (interchangeable programmers, or "resources").
The Waterfall model is attributed to Winston Royce, but the truth is that Royce never believed in a sequential production model applied to software development.
In his paper "Managing the Development of Large Software Systems", when talking about a sequential model for software development, he says:
"I believe in this concept, but the implementation [...] is risky and invites failure."
Risky and prone to failure – that depicts pretty well the fate of so many IT projects developed worldwide.
What Royce suggested in his paper was, instead, an iterative model where feedback loops in the production process would play a major role, which is exactly what you need to do when you work in an empirical context: constantly measure and adapt, as fast as you can.
So, going back to Waterfall, why on earth would a manager or an entrepreneur want to use such an approach to create their software products, when studies and empirical evidence would suggest otherwise?
In my opinion, because of the three thousand years we spent building stuff and because of the two centuries of mass-producing tangible goods since the industrial revolution.
On Command and Control
One of the byproducts of a Taylor-like model, including the Waterfall one, is the development of a "Command and Control" culture.
In its military definition, Command and Control is "The exercise of authority and direction by a properly designated commander over assigned and attached forces in the accomplishment of the mission".
You can see its resemblance to Taylor's idea that management should think and give orders, while labor workers should execute without understanding or questioning what they are asked to do.
I've seen two major problems when a culture of that kind is brought into software development organizations:
- technical team members show a decreased willingness to take responsibility for what they do, as well as decreased initiative and creativity (as explained by Daniel Pink), which would instead be quite valuable in our field
- this culture is self-reinforcing and therefore becomes a major obstacle to change in favor of a better production environment
To elaborate a bit more on the second point: if one of the founding ideas of Taylor's production model is that management should enforce the process on labor workers (an idea that somehow stuck, even when not explicitly stated today), then the kind of personality most likely to climb to a managing position is a dominant one.
In other words, the reason why we tend to see dominant managers is because the production model that we use calls for a dominant personality in managers.
Since the model also measures performance to assign incentives, and people in managing positions have a bigger impact on production (because they manage many other people), they get bigger incentives and tend to be less keen on challenging the status quo.
Unfortunately, in IT, being a dominant manager has a negative influence on productivity.
So we have an industry based on knowledge workers, which is in dire need of creativity, initiative, collaboration and knowledge sharing to perform at its best.
At the same time, that industry is designed to have dominant people float up more easily than others to managing positions, where they frequently (and most of the time involuntarily) squander the exact resources they are in dire need of.
Let’s Wrap It Up
A Taylor-like approach, being designed for algorithmic mass-production of tangible goods, is an inadequate model for knowledge workers and, specifically, for software development, where people produce intangible goods working in a collaborative and heuristic context.
The attempt to use a Taylor-like approach (which we call Waterfall in IT) may in fact be the main reason behind the high level of inefficiency and the low quality frequently associated with the IT industry, something that many companies (not to mention users and customers) just can't afford anymore.
Also, the command-and-control culture that comes with such an approach invites people to play roles, to focus on incentives rather than quality, to assume dominant positions (managers) or to avoid taking responsibility (teams).
Such a culture neither motivates nor inspires knowledge workers, and it prevents people from developing systemic awareness, which is the basis for self-organization.
The Agile Manifesto nailed all this down pretty well, ten years ago, when its signatories wrote its four values:
- Individuals and interactions over processes and tools
- Working software over comprehensive documentation
- Customer collaboration over contract negotiation
- Responding to change over following a plan
Such values are much better suited to an industry whose people collaboratively and heuristically develop intangible products of the human intellect.
What the Manifesto doesn't specify is how to actually implement those values, especially in organizations that come from many years of Taylor-like processes and where the cultural change is substantial.
Fear not, this and much more will be the subject of upcoming posts.