Approaches for Enabling Teams
One of the 4 team types in the Team Topologies book is the Enabling Team, which I have summarised here simply as: a team designed to help other teams improve. Ways of Working is one of our enabling teams at Nationwide. Organisations can form enabling teams in-house and/or pay “reassuringly expensive?” consultant experts anything up to £6250 per day!
What I’ve done here is describe and appraise some of the “plays” that enabling teams (or, if you prefer, “consulting teams”) can perform. For each approach I’ll cover: what is typically done and why, an example, and some risks.
This is far from comprehensive, but I still hope it will help drive critical thinking and more effective usage of the different approaches, and maybe also some happier, more effective enabling teams.
Methodology audit-driven
If there is a formal methodology that you believe promotes the right behaviours, you may simply want to audit compliance with it.
You could use an internal methodology, or something external like ITIL or CMMI.
Risks:
- There is a risk that you uncover compliance with the letter of the methodology but not alignment with, and commitment to, its intentions. For example, completely ineffective unit tests delivered within the mandated code coverage level.
- The methodology itself may not be as effective as intended in every case…
Best practice-driven
Whilst less formal than a full-blown methodology, this is also a very opinionated approach to advocating specific processes, systems, or rituals.
An example might be helping implement a Kanban board with the intention of making work more visible and limiting work in progress.
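To make the intent of a WIP limit concrete, here is a minimal sketch of a Kanban column that refuses new work once it is full. All the names (`KanbanColumn`, `wip_limit`, `pull`) are invented for illustration and don’t come from any particular tool.

```python
# Minimal sketch of a Kanban column enforcing a work-in-progress limit.
# Class and method names are illustrative only.

class WipLimitExceeded(Exception):
    pass

class KanbanColumn:
    def __init__(self, name, wip_limit):
        self.name = name
        self.wip_limit = wip_limit
        self.cards = []

    def pull(self, card):
        # Refuse new work once the column is at its WIP limit,
        # nudging the team to finish items before starting more.
        if len(self.cards) >= self.wip_limit:
            raise WipLimitExceeded(
                f"{self.name} is at its WIP limit ({self.wip_limit})")
        self.cards.append(card)

    def complete(self, card):
        self.cards.remove(card)

in_progress = KanbanColumn("In Progress", wip_limit=2)
in_progress.pull("Story A")
in_progress.pull("Story B")
try:
    in_progress.pull("Story C")  # blocked: finish something first
except WipLimitExceeded as e:
    print(e)
```

The point of the sketch is the behavioural constraint, not the data structure: making starting new work an explicit, sometimes-refused action is what limits work in progress.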
Risks:
- Best practices are never universally applicable and effective, so you run the risk of misapplying an idea. For example, if the domain involves a lot of “one way door” decisions where change is costly, short sprint-based planning horizons need to be balanced with up-front planning of things that need more deliberation.
- There is a risk that you improve one thing to the detriment of another part of the system. For example, automated testing so that a team can release more regularly is great, but if your production configuration management system is already creaking at the seams, more frequent releases could become very risky to live service.
- If new practices don’t replace existing work, they add overall complexity. The cost of this complexity, plus the opportunity cost of implementing the ideas, may outweigh the benefits. For example, if a team adopts sprint reviews without stopping any pre-existing milestone meetings.
Doctrine-driven
This involves operating as an education function to promote awareness of all principles and ideas that you believe to be effective — with the expectation that “enlightenment” will drive improvement.
For example, teaching the agile manifesto principles or the Wardley Doctrine in isolation.
Risks:
- By introducing “new” principles you change people’s priorities, values and thought processes. As well as doing good, this may weaken the positive effect of some of their pre-existing approaches. For example, in pursuit of greater agility, a team may become less effective at ensuring a believable business case still exists.
- The “new” principles may not lead to positive impacts, perhaps because they get misunderstood or people just don’t know how to implement them safely. For example, a team may prioritise “delivering working software” at the cost of its ability to manage technical debt.
Detailed maturity assessment and targeted best practice-driven
This one is a classic! Someone synthesises their experiences and a lot of fashionable supposedly “universal best practices” into a very detailed questionnaire. A “maturity assessment” is then conducted (often requiring numerous people to help complete the questionnaire). Everyone always scores somewhere around “lagging” to “average” and it is made painfully obvious how far their “maturity” is away from “leading” or “state of the art”. This is then turned into a shopping list of best practices that should be implemented, for example “automate some security testing”.
The intellectual property in DevOps maturity assessment tools is now worth less than a potential sales prospect - so many companies “give them away” to anyone willing to share their work email address.
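One reason almost everyone lands around “average” is simple arithmetic: averaging many 1–5 questionnaire answers pulls the overall score toward the middle band. Here is a toy illustration; the band thresholds, labels, and answers are all invented.

```python
# Toy maturity scoring: averaging many 1-5 questionnaire answers tends
# to pull the overall score toward the middle band. All thresholds,
# labels, and answers below are invented for illustration.

BANDS = [(1.5, "lagging"), (2.5, "developing"), (3.5, "average"),
         (4.5, "leading"), (5.1, "state of the art")]

def band(score):
    # Return the label of the first band whose upper bound exceeds the score.
    for upper, label in BANDS:
        if score < upper:
            return label
    return BANDS[-1][1]

answers = [2, 4, 3, 5, 1, 3, 4, 2, 3, 3]  # mixed strengths and weaknesses
score = sum(answers) / len(answers)
print(band(score))  # a team with both 1s and 5s still lands "average"
```

The mechanism hides exactly the information that matters: the individual 1s and 5s, which point at real weaknesses and strengths, are flattened into a bland aggregate.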
Risks:
- A lot has been written about the risks of this approach, for example here. I’ll avoid delving in and just summarise the biggest risk as I see it: you end up with a lot of very appealing tactics and very little idea of how effective they will be in your context or how to get started safely.
Pitstop for some perspective…
I ordered the approaches covered so far so that each one has increasing coverage of the OODA (Observe, Orient, Decide, Act) loop (see below).
As you can see, everything I’ve covered so far (coloured red through to green) is very solution-oriented — “here is a method/practice/idea that you absolutely need, you just need to…”
When Simon Wardley elaborates on the OODA loop further from a strategy perspective here, he espouses the value of thorough observation and orientation. He provides detailed guidance on how to appreciate where you are today and what you might anticipate could happen in the future.
I shall now explore two approaches with a much greater emphasis on observation.
Value stream appraisal-driven
This one is popular, and for good reason. You choose a function that part of the organisation performs which delivers value, either to customers or to another part of the organisation. You then map in detail everything it takes to deliver one “unit of value” end-to-end. This brings people together around a common mental model of how things actually work, which can then be used to spot opportunities for improvement.
Value stream mapping is a very popular approach. I also love this approach described by Patrick Debois which he calls Monitoring Stream Mapping (you start with production).
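A common observation to pull out of such a map is flow efficiency: the fraction of end-to-end lead time spent on value-adding work rather than waiting between steps. A minimal sketch, with step names and durations entirely made up:

```python
# Illustrative flow-efficiency calculation over a mapped value stream.
# Step names and durations are invented for the example.

steps = [
    # (step, active_hours, waiting_hours)
    ("Raise change request", 1, 40),
    ("Develop and test", 16, 8),
    ("Security review", 2, 72),
    ("Deploy to production", 1, 24),
]

active = sum(a for _, a, _ in steps)
waiting = sum(w for _, _, w in steps)
lead_time = active + waiting

# Flow efficiency: value-adding time as a fraction of total lead time.
flow_efficiency = active / lead_time
print(f"Lead time: {lead_time}h, flow efficiency: {flow_efficiency:.0%}")
```

Even this toy example shows the usual finding: most of the lead time is queuing between steps, so the biggest improvement opportunities are often in the hand-offs rather than in the work itself.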
Risks:
- This is well suited to repeatable processes. If the thing you want to improve is more variable and organic, there is a risk you map something that won’t happen the same way next time — rendering your observations useless.
Value opportunity-driven
I’ve saved this until last because I think it has the most potential and I’m genuinely excited about it. In a nutshell this approach is about first discovering the business outcomes you want to affect and then working backwards.
So how do we reason about our current performance, risks, and opportunities?
We all draw inspiration from 20th century manufacturing, whether that be Lean from Toyota or ideas such as the Theory of Constraints from Eliyahu M. Goldratt’s The Goal. In The Goal the business performance problems were clear and well understood: the factory was unable to ship finished parts on time and customers were unhappy. I feel that in our knowledge-driven domains the problem is often not so clear. Possibly a line of business has a status report or some kind of balanced scorecard, and this could be a great place to start, but does it really tell you the biggest priority / opportunity?
Think about a team you are familiar with right now: is their primary issue quality, operational cost, operational risk, or time to market? What are you actually trying to improve from a business outcome perspective? (Even medium-sized teams will probably have different answers for different services they offer.) I believe this thought process is often overlooked as we jump to secondary factors like trying to improve point processes or even improve colleague experience.
I have looked for examples of approaches to appraising team performance from a value perspective. The vast majority of guides I found dealt with secondary properties: How well does a team make decisions? How aligned to a vision is a team? These are all great things to explore, but they generally aren’t the things we actually take to the bank. I’m not, for example, disagreeing with the value of psychological safety, but ultimately I think it is only an ingredient that can “contribute” to success.
The best example I can find is Wardley Mapping. It is a highly evolved approach to forming strategies, and therefore something enabling teams can aspire to assist with!
Risks:
- Some stakeholders may be impatient with such an analytical approach and maybe they are sometimes right. Perhaps some opportunities for improvement are obvious and don’t need detailed existential analysis.
- If this analysis reaches the wrong conclusions, it is worse than doing no analysis at all!
Wrap up
Hopefully this has been a useful tour through some enabling approaches and will help you think more critically about different methods enabling teams can attempt.