The big deal about small stuff?

Tomas Eilsoe
8 min read · Mar 18, 2018


www.tomaseilsoe.com

Do you think big is better?

I guess it depends on the context. But there is even more to it than context. Waterfall and agile look at the economics of batch sizes from two conflicting paradigms and come to very different conclusions, even in the exact same business context.

Why does size matter?

Classical management favors LARGE batches and sees them as a way to optimize efficiency and reap the benefits of economies of scale.

Lean and agile favor SMALL batches as the fundamental idea to optimize total economics and increase the likelihood of successful outcomes.

While some organisations totally get the agile paradigm of small batches, others struggle. When organisations “go agile” but struggle to reduce batch size, you end up with a dysfunctional organisational system: big batches executed with agile terminology, agile meetings and agile role names. But the agile promises of success remain just that: promises.

Interesting conflict of paradigms, so let’s explore the dilemma.

A batch is a collection of “stuff” which is treated as one unit with respect to a process.

Let’s explore the concept and the two paradigms in a simple process most of us recognise. Let’s say it’s your birthday and you decide to bake a cake. …I know… I know… baking cakes and doing sophisticated product development are two very different things, but bear with me. It turns out that the same principles hold true for highway traffic, internet packets, cake baking and product development work flowing through a company. Looking at the principles first in a simple domain helps highlight the problems of the conflicting ideas when we bring them to our target domain later.

Still here? :-) Great… All right, we consider that cake the batch. The size of the batch is the amount of stuff it consists of. When baking our cake we go through process steps like picking materials, mixing, and baking it in the oven.

Imagine we use the same amount of stuff, but instead of baking one big cake we decide to bake 10 small cakes.

Let’s try that again, but this time we do something really crazy.

Instead of putting them all in the oven at once, we create them one at a time. So for each small cake we pick materials for only that small cake, then mix materials for that cake only, and so on. Then we do all the steps again for each cake, until all ten cakes are done.

When we create the 10 cakes this way, each small cake is a batch, and the size of the batch would be reduced to one tenth.

What would happen to the effort and duration when we bake 10 small cakes one at a time, compared to one big cake? They would probably increase tenfold, right?

What about efficiency? We pick materials and bake 10 times instead of just once. How efficient is that?

One would be crazy to go for the small batch approach, right?

Well… this is the classical way of thinking about batches. With this thinking the conclusion is clear: if we want to be efficient and fast, we should increase the batch size and push as much cake as possible into that oven every time we get the chance.

A manager with a classical mental model of batch economics might think: Agile is trendy, colorful, nice to people and all. I can live with that, and even the sticky notes, but I draw the line at small batches. It’s clearly inefficient, expensive and time consuming.

There must be a limit to this economically irresponsible madness. Agile might work for others, but clearly that part of agile is not a fit for our specific context.

So how come Lean and Agile try to convince us to favor small batch sizes?

The most important blind spot in classical thinking is that it assumes the cost we incur every time we run a batch through the steps is static. Lean calls this cost the transaction cost.

Lean teaches us that transaction costs can actually be reduced dramatically when we have a lot of small, repeatable transactions. The other blind spots are the significant economic benefits of small batches, listed further below.

The overall economic outcome can become better with smaller batches.
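To see the two mental models side by side, here is a minimal sketch in Python. The numbers are made up purely for illustration; the point is the shape of the comparison, not the values:

```python
# Total cost of delivering a fixed amount of work, split into n batches.
# The value-adding work is the same either way; only the per-batch
# transaction cost changes the total. All numbers are illustrative.

def total_cost(batches: int, transaction_cost: float, value_add_work: float = 100.0) -> float:
    """Total cost = fixed value-adding work + per-batch transaction cost."""
    return value_add_work + batches * transaction_cost

# Classical view: the transaction cost is static (say 10 units per batch).
print(total_cost(batches=1, transaction_cost=10.0))   # 110.0 -> one big batch wins
print(total_cost(batches=10, transaction_cost=10.0))  # 200.0 -> small batches look wasteful

# Lean view: invest in automation so each transaction costs ~0.1 units.
print(total_cost(batches=10, transaction_cost=0.1))   # 101.0 -> batch size barely matters
```

With a static transaction cost the classical conclusion is correct. The Lean argument only works because the transaction cost itself is treated as something we can attack.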

In the cake example we would have to invest in automation and change how the steps in the process are done to make it feasible to bake small cakes in a single-piece flow.

Kind of the same idea as building an assembly line.

We look for opportunities to reduce the transaction cost until it doesn’t really matter economically whether we choose to bake few big or many small cakes. The only difference would be the speed of the assembly line as the cakes slide through the long oven. Bigger cakes need more heat (work), and thus go slower.

All right, enough about cakes. How does this translate to product development?

In product development our batch is a piece of functionality in the product — a product feature. The smaller we slice the features, the smaller the batch size will be. A feature we spend many hours building is a large batch.

The transaction costs are the costs we incur every time we deliver a feature, no matter how big that batch/feature is.

So we find the transaction costs by asking:

What is it we have to do every time we deliver a feature, no matter the size of the feature?

Think about that for a second. What would it be in your context?

Regression tests are usually a big transaction cost in product development, but what the transaction costs are, and how they can be reduced, varies from context to context.

In waterfall, where we take all the functionality through each process step once, the whole project is one large batch. Like in the unautomated oven bakery.

In Agile we take a small portion of the total functionality (a small batch) and run it through all the process steps. Define, build, test, … To make this economically feasible we look for opportunities to reduce the transaction costs.

Does it make economic sense to build in tiny batches if our company only has big-oven technology? It might, but probably not. We probably have to reduce batch size gradually as we make it economically feasible, by building and improving an “assembly line”, or parts of it, as sketched below.
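One way to make “gradually” concrete is a simple decision rule: let the batch shrink only as far as the current transaction cost allows. This is a minimal sketch, and the 10% overhead threshold is my own illustrative assumption, not a number from the Lean literature:

```python
# A sketch of letting batch size shrink as transaction costs fall:
# keep per-batch overhead below ~10% of the batch's value-adding work.
# The 10% threshold is an arbitrary, illustrative choice.

def feasible_batch_size(transaction_cost: float, max_overhead_ratio: float = 0.10) -> float:
    """Smallest batch (in units of value-adding work) that keeps the overhead ratio acceptable."""
    return transaction_cost / max_overhead_ratio

for cost in (10.0, 1.0, 0.1):  # transaction cost falls as we automate
    print(f"transaction cost {cost:>4}: batches can shrink to {feasible_batch_size(cost):.0f} work unit(s)")
```

Every investment in the “assembly line” lowers the transaction cost, which in turn makes a smaller batch economically defensible. More on that below, but first: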

Why is all that batch size reduction worth the trouble anyway?

Small batches influence outcomes positively through:

  • Reduction in management overhead costs
  • Increased adaptability
  • Reduction in time to market
  • Increased feedback
  • Reduction in risk
  • Increased efficiency
  • Increased motivation
  • Increased feeling of urgency
  • Increased transparency
  • Increased trust
  • Increased predictability and planning ability
  • Reduction in invisible inventory
  • Leveling effect on competence demand
  • Increased ability to prioritize work

I will revisit and explain the above list in a future story, so follow me if you find that interesting.

Some of the above effects are surprising to people who are used to working with large-batch processes.

Take the management overhead cost, for instance. If we do one project, we need to initiate, approve, report on and manage that project. If we do the same project scope but split it into ten 1/10-sized projects, the total management cost must go up, right?

We would now need to run ten projects and get each initiated, approved, reported on, etc. But the cake story showed that, in Lean thinking, this mental model does not tell the complete story.

Can management cost really decrease when we split a big project into 10 small ones? How come?

Let’s look at the test manager as an example. In a typical waterfall project the test manager would manage and analyse product quality risks, manage test cases, create test plans and manage the test process.

In a fully Lean/Agile organisation we would expect to find zero test managers.

That management overhead is eliminated entirely.

What happens instead is that quality is built in.

The regression tests are automated like an assembly line. The transaction cost of running a complete system regression test is ideally a push of a button at near zero cost and time.
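As a concrete, hedged illustration: once the suite is automated (here I assume a pytest-based Python suite under tests/; your stack will differ), the whole “transaction” collapses into one command that any developer or pipeline can trigger:

```python
# A minimal sketch of the "push of a button": one entry point that runs
# the entire regression suite. Assumes a pytest-based suite under tests/;
# substitute whatever test runner your stack uses.
import subprocess
import sys

def run_regression_suite() -> int:
    """Run the complete regression suite; exit code 0 means all tests passed."""
    result = subprocess.run([sys.executable, "-m", "pytest", "tests/", "-q"])
    return result.returncode

if __name__ == "__main__":
    sys.exit(run_regression_suite())
```

Wire that into every commit and the per-feature transaction cost of regression testing approaches zero.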

There is no need for a test manager here, at least not in the classical sense. We still need to think a lot about how we test, and some of the competences are still needed, but now the management problem is more of a test system engineering problem.

Running a test is now almost FREE. Which is great! Not because we save money on testing (we probably don’t, at least not as a first-order effect), but because:

Our economics are no longer influenced significantly by how big the feature/batch is.

The automated test system does not come for free — far from it — just as a cake assembly line doesn’t. Lean suggests we invest in our product development infrastructure to make our problem look more like the assembly line than the single huge oven case.

This way we can enjoy all the benefits listed above and improve our total economic outcome.

Big batches should not be seen as an isolated problem in the development department.

The whole company is part of the “assembly line”, or “value stream” to use a Lean term. So in the search for opportunities to decrease transaction costs, and then batch size, we need to look at all the areas where big batches are usually found:

  • Marketing
  • Analysis
  • Funding
  • Design
  • Purchasing
  • Prototyping
  • Testing
  • Management reviews
  • Tasks for specialized central resources

To inspire others, please comment below on how you invest in making smaller batches economically feasible.

Hope this story was helpful to you! Follow me and learn about small batches, their benefits and tricks to reduce them in my next Medium stories.

THINK ABOUT IT! Do you recognize this? Please let us know in the comments!

If you found this story interesting, then hammer and hold that clap button for 10 seconds :-)

Want more? Read my previous story: The biggest killer of agile transitions and how to cure it.


Tomas Eilsoe

Agile coach, F-16 pilot, owner of several startups. Loves dealing with problems related to collaboration in complex and uncertain environments.