Should we always use cutting-edge technology?

Nikolay Stanev
3 min read · Oct 27, 2017


When it comes to software architecture, the correct answer to almost any question is ‘It depends’. And as far as cutting-edge technologies are concerned, it really does depend.

I guess every one of us has been tempted by this new design pattern, that fancy framework, or the almighty technology stack that has just been released. Not all of those experiments were successful, but not all of them were complete disasters either. Here are some lessons I have learned throughout my career.

More often than not, performance matters. And performance has at least two aspects: the performance of the product you deliver and the performance of the team. In both cases, the effect of cutting-edge technology is negative. When you try something new, it is often so new that you could consider it unfinished. And it is not only the bitter taste of using V1 releases or the learning curve. You often stumble upon features not behaving as promised, undocumented side effects, or even undocumented functionality that works, but you don’t know how. The impact on the deadlines is palpable. And in the end, when all the obstacles are overcome and the product is deployed to production, an avalanche of new side effects, never discovered during development, begins. Don’t get me wrong: this is not the rule, but unfortunately it is not an exception either.

On the other hand, trying something new is an intrinsic trait of every software developer. In my opinion, the switch from COBOL to C was a giant leap in the evolution of software development, as profound as our distant ancestors’ move from water to land. As a consequence, working with the latest technology is an important part of the motivation package we offer when we hire. Conversely, working with an obsolete stack is one of the biggest reasons people quit their jobs.

Even though working with a brand-new technology is an enticing intellectual challenge, there are plenty of cases where we end up with something the community somehow avoided adopting, something people delicately call a dead-end technology. Do you remember Silverlight, JavaFX or Windows Phone? How would you support your product later on? Where would you find new team members? I gave those examples to remind you that even the biggest players in the industry may sometimes fail, so even the company behind this brand-new-stuff-you-want-to-use should not be taken as a guarantee of further adoption.

To mitigate the aforementioned issues, I prefer to stick to an evolutionary approach. Initially, I apply the cutting-edge stuff only to a small subset of the product functionality (or use it when the project is small enough), and only if it proves successful do I start extending its usage.
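As a rough illustration, here is a minimal Python sketch of that approach; the feature and class names (ImageFilter, LegacyFilter, ExperimentalFilter, make_filter) are hypothetical, not from the article or any real library. The idea is to hide the new technology behind a stable interface and enable it only for a whitelisted slice of the product, which also keeps the rip-it-out option described below cheap.

```python
# A minimal sketch of the evolutionary approach; all names are hypothetical.
from abc import ABC, abstractmethod

class ImageFilter(ABC):
    """Stable interface the rest of the product depends on."""
    @abstractmethod
    def apply(self, image: bytes) -> bytes: ...

class LegacyFilter(ImageFilter):
    """The proven implementation, used for most of the functionality."""
    def apply(self, image: bytes) -> bytes:
        return image  # placeholder for the battle-tested code path

class ExperimentalFilter(ImageFilter):
    """Wraps the cutting-edge library behind the same stable interface."""
    def apply(self, image: bytes) -> bytes:
        return image  # placeholder for the new, unproven code path

def make_filter(feature: str, experimental_features: set[str]) -> ImageFilter:
    # Only an explicitly whitelisted subset of the product uses the new stack.
    if feature in experimental_features:
        return ExperimentalFilter()
    return LegacyFilter()

# Start with a single low-risk feature; grow the set only after it proves itself.
thumbnail_filter = make_filter("thumbnails", {"thumbnails"})
```

Because every caller depends only on the interface, extending the experiment is a one-line change to the whitelist, and abandoning it means deleting one class.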

And the most important part: if it was the wrong choice, remove every single trace of it from the project. Then start over with the next new-thing-that-will-change-the-way-software-is-developed.

Nikolay Stanev

Software Architect, Manager, and Developer with interests in Machine Learning, Data Science and AI. https://www.linkedin.com/in/nikolaystanev/