
World War 2, Rationalism and Agile Development

One of the central values of the Agile Manifesto is "Responding to change over following a plan." This doesn't mean it's not agile to make a plan. You should have a plan, but you should realize its limitations.

Are there any parallels to this in history?

The Soviet Union had five-year plans. Stalin was quite successful in industrializing a rural country in a short span of time. But his methods did not work well on the battlefield. In the Second World War, the Soviets suffered severe losses in the beginning.

You can plan how to build and manage a factory, because the complexity is limited. But a war is far more complex: you cannot predict what the enemy is going to do. This doesn't mean that you should go to war without a plan. Far from it; you should plan as much as possible, but be prepared to improvise when something unexpected happens. As Dwight D. Eisenhower, a successful American general in World War 2, said: "Plans are nothing; planning is everything."

Building software is also complex. But some managers try to run a software development organization like a factory. They treat their employees like cog-wheels. They think that Employee #1093 can take over the responsibilities of Employee #835 with just some training, or that 4 programmers can do in 3 months what one programmer can do in a year. It doesn't work like that. If software development were that predictable, robots could do it. But they can't.

I don't like being treated like a cog-wheel. I am not Employee #1093, I am me. I am unique with my own weaknesses and strengths. If I am treated like a replaceable part, I may start to behave like one. Then, I will only do as I'm told, nothing more, nothing less. If nobody listens to my ideas, I will stop thinking creatively, and just do things the way we always did.

Since I don't want to become mediocre, I avoid environments like that. I seek challenging environments where my ideas are heard and I am given responsibility to work as I think is best.

I think that Churchill, the Prime Minister of Great Britain during WW2, was agile too. He was not afraid to change. He said: "To improve is to change; to be perfect is to change often."

He also said, "However beautiful the strategy, you should occasionally look at the results." That sounds like iterative development to me: you make a plan, and then review the results now and then to see how it is working.

So how about Hitler? How agile was he? Not agile at all. He had a grand vision and a plan that seemed to work very well in the beginning. But as we all know, he failed. I think this is similar to a waterfall project: he was 80% finished with conquering Western Europe and 75% finished with Eastern Europe. If he had run the war more iteratively, he would have completed Western Europe first, and only then started on Eastern Europe. Thank God he didn't!

What did Hitler and Stalin have in common? Both Nazis and Communists were rationalists with a strong belief in their own enlightened minds. I dare say that Churchill and Eisenhower had a more humble attitude, accepting that they could not make perfect plans. I would also say that Churchill was an empiricist in that he wanted to review the results of his plans.

Rationalists think they can plan everything. Empiricists believe in experimental evidence. I think that waterfall development is a result of rationalist thinking, and it doesn't work well, because our minds are limited. We cannot predict everything, but we can plan to a large degree. Therefore, I believe that iterative development, where you make a plan but also review the results regularly, is the most effective way.

PS: I am not a historian nor a philosopher, so please, enlighten me if I got something wrong.

Comments

Jeff said…
Spot on. I think conventions are important insofar as they enable the leveraging of creativity among groups of individuals. You mention stuffy, constraint-laden shops as an example where these are pushed past their usefulness. I'd proffer that the manager in this example, and by extension the "rationalist" social planner, place their goals above the creative ambitions of their direct reports. Otherwise, in what capacity are they the "boss"? They wouldn't turn over power for that...
