Standardization is the Death of Innovation

As a tech organization grows, it adds teams. These teams are increasingly made up of members with different backgrounds, experience, and expertise, and they start solving very different problems. Microservices increase this divergence, giving teams the freedom to solve problems independently of other teams' code bases. Each team can solve its problem in the way best suited to both the problem and the team.

This divergence will eventually start to make upper management uncomfortable, and they will often push standardization initiatives. Standards can take the form of libraries, frameworks, infrastructure such as databases, or any of the many other choices a developer makes when designing a solution. Often these initiatives are framed as a benefit for developers: "When developers switch teams, they should not have to learn the new team's technology from scratch." Other times they are framed as a benefit to Ops: "Ops is easier when every service deploys and runs the same way in production." First, I will take down both of these arguments. Then I will suggest a better approach.

First, if you have an Ops department in charge of all services, you are doing it wrong. DevOps should go hand in hand with microservices: each team is responsible for deploying and running its own services. The team that writes a service will never care whether it deploys and runs differently from other teams' services, because they only have to run their own.

Making it easier for developers to change teams is solving a non-existent problem. Developers are always learning. They will not mind learning a new technology when they change teams. What they will mind is being told HOW to solve problems. Developers are often already told WHAT problems to solve by a separate product team, which is bad enough. Now they are being told HOW to do it, and that will not go over well. Developers cannot do their jobs properly when they have no authority to solve problems in the way they see fit.

There are, of course, benefits to standards. Standards are a great way to save time when a problem has already been solved and you want to avoid reinventing the wheel. But this only works when the standard is a good one, and when you force a standard on people, you have no way of knowing whether it is good or bad. A better way is to provide a standard but not mandate it. If it is good, you will see adoption follow the adoption curve:

https://ondigitalmarketing.com/learn/odm/foundations/5-customer-segments-technology-adoption/

Once the Late Majority adopts it, you have a pretty good idea that it is good, and you can feel justified in mandating it for the laggards. But sometimes it will never make it that far. If not, that means it has some flaws, and often another competing standard will pop up, created by others in your organization. This is good, because whichever of the competing standards gains more adoption is probably the better one. This way, you always end up with the best ideas your organization can offer. Sometimes it is even worth keeping multiple standards, because each may be the best fit for a specific subset of your problems.

So the best thing an organization can do is foster ideas from within and provide a way for each team to communicate its ideas to other teams. Make libraries, frameworks, and standards developed by one team available to all the others, so they can assess the value themselves and make an informed decision about which to adopt. Track the adoption of each, and once one reaches critical mass, use that momentum to promote it to the laggards. You will find this much easier, it will keep morale up, and you will end up with a stronger technology platform.
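To make "track the adoption of each" concrete, here is a minimal sketch of one way to measure it. It assumes a hypothetical layout where every service lives under services/<name>/ with a requirements.txt, and internal libraries share an "acme-" prefix; the 50% critical-mass threshold is likewise an assumption, not a rule from this article.

```python
# Sketch: count how many services depend on each internal library,
# and flag the ones that have crossed a critical-mass threshold.
# All paths, prefixes, and thresholds below are hypothetical examples.
from pathlib import Path
from collections import Counter

SERVICES_DIR = Path("services")   # hypothetical repo layout: services/<name>/requirements.txt
INTERNAL_PREFIX = "acme-"         # hypothetical naming convention for internal libraries
CRITICAL_MASS = 0.50              # hypothetical threshold: promote once half the services adopt

def internal_deps(req_file: Path) -> set[str]:
    """Return the internal libraries declared in one requirements.txt."""
    deps = set()
    for line in req_file.read_text().splitlines():
        name = line.split("==")[0].split(">=")[0].strip().lower()
        if name.startswith(INTERNAL_PREFIX):
            deps.add(name)
    return deps

def adoption_report() -> None:
    req_files = list(SERVICES_DIR.glob("*/requirements.txt"))
    counts = Counter()
    for req in req_files:
        counts.update(internal_deps(req))   # each service counts once per library
    total = len(req_files) or 1
    for lib, n in counts.most_common():
        share = n / total
        status = "promote to laggards" if share >= CRITICAL_MASS else "keep optional"
        print(f"{lib}: {n}/{total} services ({share:.0%}) -> {status}")

if __name__ == "__main__":
    adoption_report()
```

Run periodically (or in CI), a report like this gives you the adoption data to decide when a library has earned a mandate, instead of mandating it up front.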