The big deal about small stuff?

Do you think big is better?

I guess it depends on the context. But there is even more to it than context. Waterfall and agile look at the economics of batch sizes from two conflicting paradigms and come to very different conclusions, even in the exact same business context.

Why does size matter?

Classical management favors LARGE batches and sees them as a way to optimize efficiency and reap the benefits of economies of scale.

A batch is a collection of “stuff” which is treated as one unit with respect to a process.

Let’s explore the concept and the two paradigms in a simple process most of us recognise. Let’s say it’s your birthday and you decide to bake a cake. …I know… I know… baking cakes and doing sophisticated product development are two very different things, but bear with me. It turns out that the same principles hold for highway traffic, internet packets, cake baking and product development work flowing through a company. Looking at the principles first in a simple domain helps highlight the problems of the conflicting ideas when we bring them to our target domain later.

Suppose you decide to bake ten small cakes instead of one big one. The obvious approach is the large batch: pick the ingredients for all ten cakes, mix all the batter, and put all ten cakes in the oven at once.

Now let’s try that again, but this time we do something really crazy.

Instead of putting them all in the oven at once, we create them one at a time. For each small cake we pick ingredients for only that cake, then mix for that cake only, and so on. Then we repeat all the steps, cake by cake, until all ten cakes are done.

One would be crazy to go for the small batch approach, right?

Well… this is the classical way of thinking about batches. With this thinking the conclusion is clear: if we want to be efficient and fast, we should increase the batch size and push as much cake as possible into that oven every time we get the chance.
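To see why the classical conclusion feels so obvious, here is a toy cost model (a sketch with invented numbers, not figures from any real bakery): each batch pays a fixed transaction cost, such as heating the oven, plus a variable cost per cake.

```python
# Toy batch-economics model. All numbers are invented for illustration.
OVEN_WARMUP_MIN = 30    # fixed "transaction cost" paid once per batch
BAKE_PER_CAKE_MIN = 10  # variable cost per cake
CAKES = 10

def total_time(batch_size: int) -> int:
    """Total minutes to bake all cakes, baking batch_size cakes at a time."""
    batches = -(-CAKES // batch_size)  # ceiling division
    return batches * OVEN_WARMUP_MIN + CAKES * BAKE_PER_CAKE_MIN

print(total_time(10))  # one big batch:    1*30 + 10*10 = 130 minutes
print(total_time(1))   # ten tiny batches: 10*30 + 10*10 = 400 minutes
```

With a hefty fixed cost per batch, the large batch wins by a wide margin, exactly as classical thinking predicts.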

So how come Lean and Agile try to convince us to favor small batch sizes?

The most important blind spot in classical thinking is that it assumes the cost we incur every time we run a batch through the steps is static. Lean calls this cost the transaction cost.

But the transaction cost is not fixed. If we drive it down, the overall economic outcome can actually become better with smaller batches.

In the cake example we would have to invest in automation and change how the steps in the process are done to make it feasible to bake small cakes in a single-piece flow.

Kind of the same idea as building an assembly line.

We look for opportunities to reduce the transaction cost, until it doesn’t really matter economically whether we choose to bake few big or many small cakes. The only difference would be the speed of the assembly line as the cakes slide through the long oven. Bigger cakes need more heat (work), and thus go slower.
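A small sketch of what happens when investment drives the transaction cost down. The numbers are invented for illustration; the point is only the shape of the comparison:

```python
# Toy model: total time as a function of batch size and transaction cost.
# All numbers are invented for illustration.
CAKES = 10
BAKE_PER_CAKE_MIN = 10

def total_time(batch_size: int, transaction_cost: int) -> int:
    """Total minutes for all cakes at a given batch size and per-batch cost."""
    batches = -(-CAKES // batch_size)  # ceiling division
    return batches * transaction_cost + CAKES * BAKE_PER_CAKE_MIN

# High transaction cost (30 min oven warm-up per batch): big batches win.
print(total_time(10, 30), total_time(1, 30))  # 130 vs 400

# After investing in an "assembly line" (per-batch cost down to 1 min),
# batch size barely matters any more.
print(total_time(10, 1), total_time(1, 1))    # 101 vs 110
```

Once the per-batch cost is small, the penalty for single-piece flow nearly vanishes, which is exactly the condition under which small batches can pay for their other benefits.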

All right, enough about cakes. How does this translate to product development?

In product development our batch is a piece of functionality in the product — a product feature. The smaller we slice the features, the smaller the batch size will be. A feature we spend many hours building is a large batch.

What is it we have to do every time we deliver a feature, no matter the size of the feature?

Think about that for a second. What would it be in your context?

Why is all that batch size reduction worth the trouble anyway?

Small batches influence outcomes positively through:

  • Increased adaptability
  • Reduction in time to market
  • Increased feedback
  • Reduction in risk
  • Increased efficiency
  • Increased motivation
  • Increased feeling of urgency
  • Increased transparency
  • Increased trust
  • Increased predictability and planning ability
  • Reduction in invisible inventory
  • Leveling effect on competence demand
  • Increased ability to prioritize work
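The time-to-market and feedback effects in the list above can be made concrete with a little arithmetic. Assume (purely for illustration) 100 hours of total work, split into equal-sized features that ship one after another as they finish:

```python
# Invented numbers: 100 hours of total work, split into n equal features.
TOTAL_WORK_HOURS = 100

def first_delivery(n_features: int) -> float:
    """Hours until the first feature ships and feedback can start."""
    return TOTAL_WORK_HOURS / n_features

def average_delivery(n_features: int) -> float:
    """Average hours a feature waits before reaching users, assuming
    features ship sequentially: the k-th ships after k * (total/n) hours."""
    t = TOTAL_WORK_HOURS / n_features
    return t * (n_features + 1) / 2

print(first_delivery(1), average_delivery(1))    # one big batch:  100.0 100.0
print(first_delivery(10), average_delivery(10))  # ten small ones:  10.0  55.0
```

Even with zero transaction-cost savings, slicing into ten features means the first feedback arrives ten times sooner, and the average feature reaches users in roughly half the time.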

Some of the above effects are surprising to people who are used to working with large batch processes.

Take the management overhead cost, for instance. If we do one project we need to initiate, approve, report on and manage that project. If we do the same project scope, but split the project into ten 1/10-sized projects, the total management cost must go up, right?

Can management cost really decrease when we split a big project into ten small ones? How come?

Let’s look at the test manager as an example. In a typical waterfall project the test manager would manage and analyse product quality risks, manage test cases, create test plans and manage the test process.

With small batches, where each small feature is verified by an automated test system as it is built, there is no big test phase left to plan and manage. That management overhead is essentially eliminated.

What is done instead is that quality is built in.

Our economics are no longer influenced significantly by how big the feature/batch is.

The automated test system does not come for free (far from it), just as a cake assembly line doesn’t. Lean suggests we invest in our product development infrastructure to make our problem look more like the assembly line than the single huge oven case.

This way we can enjoy all the benefits listed above and improve our total economic outcome.

Big batches should not be seen as an isolated problem in the development department.

The whole company is part of the “assembly line”, or “value stream” to use a Lean term, so we need to look at all the areas where big batches are usually found in the search for opportunities to decrease transaction costs and, in turn, batch size:

  • Analysis
  • Funding
  • Design
  • Purchasing
  • Prototyping
  • Testing
  • Management reviews
  • Tasks for specialized central resources

Agile coach, F16 pilot, owner of several startups. Love dealing with problems related to collaboration in complex and uncertain environments.