Choosing a Programming Language / Framework: the Most Important Part of a Project that Doesn’t Matter

Okay, talk about a loaded title, but let me explain. Whenever we start up a new project, especially with a group of individuals with a wide range of technology preferences and experience, it can become a borderline holy war.

Want a demonstration? Find a group of people using any technology, be it .NET, PHP, Ruby on Rails, Python, JavaScript, etc. Strike up a conversation and suggest that on the next project they tackle, they should try something other than their technology of choice. You’ll likely be either openly dismissed or met with hostility.

But why?

We Have to be Right

One of humanity’s greatest flaws is its need to be right. It’s led to some of our worst tragedies and is one of the reasons why US politics is completely jacked up. You see, people are so hard-wired with the need to be right that, even in the face of irrefutable evidence, they will often still claim they are right and see that belief as equal to proven fact.

As developers we are no exception. Pick any language or framework and call it “the best,” and you’re either naïve or a liar. Every technology has its strengths and weaknesses. For you as an individual, your choice might be the best for you right now on this specific project, but it’s unlikely it’ll be the same tomorrow.

With all that said, when you imply a technology might be better suited for a task than whatever someone else believes, you’re challenging what their reality has determined to be true. In other words, you’re telling them they are wrong, which will often trigger a defensive and frequently hostile response.

The Truth is it Doesn’t Really Matter

I’ve seen entire teams grind to a halt trying to determine which technology to use to tackle a problem. I’ve seen developers waste countless hours because Technology A was “better” than Technology B according to a specific metric. In both cases, projects failed because of poor decision making rather than simply taking what works and running with it.

In over a decade of doing development, I’ve seen technologies used for things they were never meant to do. I’ve seen frameworks and languages that rank among the lowest in benchmarks work perfectly fine at scale. I’ve never seen a project stalled by a problem that its chosen technology just couldn’t handle.

Case in point: I actually enjoyed a comical session with Orlando Devs called “Frontpage Bestpage,” where the speaker used FrontPage, generally accepted as one of the worst things you could ever use to develop a meaningful website, and produced a functional website in a short session (even though just about everything you could imagine going wrong did). The entire point of the session was that the technology you use really isn’t that important. Ideally we want to be able to take full advantage of the technology, but frankly, if it gets to the point where your technology is your limitation, either you have the means to overcome it or your project has already failed.

Why bring this up?

I, like most developers, have had friends, coworkers, etc. who have different technology preferences. I’ve heard countless times one person berating another: “Why would you use that, it’s a blocking language!”, “More people use Tech A than Tech B”, “You shouldn’t use Tech B, Tech A is faster!”, and so on.

I actually really enjoy a nice philosophical discussion about “what is better”; it’s a question with no answer, but it can be rather thought-provoking with the right audience. The issue is how many people just draw their line in the sand: “My tech of choice is better than yours.” You give a counterexample, and they respond only with “It’s SOOOOOO bad” or “You’re wrong.” I’m okay with “you’re wrong” so long as it’s followed by “because…” and something of substance. That’s how we learn from one another and gain a better understanding of the bigger picture.

So What You’re Saying is Just Use Whatever?

Well, not exactly; there are times when you can make wrong choices. Case in point: if I want to create a native Android app, I’m probably going to want to shy away from COBOL.

Essentially, you need to consider which technologies you know or are willing to learn, come up with a short list of pros and cons, and based on that, get started. Once you start, don’t backpedal and switch unless you come across a limitation you just can’t overcome (these limitations are actually extraordinarily rare).
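To make that pros/cons step concrete, here’s a minimal sketch of turning a shortlist into a quick weighted comparison. The criteria, weights, and scores are all hypothetical, purely for illustration; the point is to decide once and get moving, not to build a perfect model.

```python
# Hypothetical weighted shortlist: score each candidate technology
# against the criteria the team actually cares about, pick the
# winner, and start building instead of debating forever.

# Weights reflect how much each criterion matters to *this* project.
weights = {"team_familiarity": 3, "ecosystem": 2, "hiring": 1}

# Scores (1-5) are the team's honest gut calls, not benchmarks.
candidates = {
    "Tech A": {"team_familiarity": 5, "ecosystem": 3, "hiring": 4},
    "Tech B": {"team_familiarity": 2, "ecosystem": 5, "hiring": 3},
}

def total(scores):
    """Weighted sum of a candidate's scores across all criteria."""
    return sum(weights[c] * scores[c] for c in weights)

best = max(candidates, key=lambda name: total(candidates[name]))
for name, scores in candidates.items():
    print(f"{name}: {total(scores)}")
print(f"Pick {best} and get started.")
```

Notice the deliberately heavy weight on team familiarity: per the argument above, what the team can actually ship with usually matters more than any benchmark.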

The point is: obey Wheaton’s Law when discussing technology. The technology itself really only matters in terms of developer proficiency and whether it can reasonably handle your needs. (And we’re not talking about scaling; premature optimization kills projects, and tackling scaling too early is a form of optimizing too early.)