Hello, and thank you for having me.
First and foremost, let me express my deepest appreciation to Forty Two for boldly shifting the focus of education in IT back to where it should have always been: the fundamentals of software engineering.
In late 2016, Oliver Belanger and Charles Summers gave me the opportunity to see how the organization operates at its heart. Seeing the approach from the inside only strengthened my belief in your mission. I am grateful for the chance to talk to you today, and allow me to say a few words in support of Forty Two's vision.
It is becoming increasingly apparent that we live in a world governed by the power law. A power law implies a heavy-tailed distribution of impact relative to effort, with its most famous formulation being the Pareto principle: twenty percent of people drink eighty percent of the champagne. In IT, the state of the art for those who fearlessly set out to explore computer science and software engineering is very different from the "market" of kickstarting one's career by landing a job as a developer. Having spent the past ten years in the industry, I can feel very well that it takes more and more determination to stay sane in today's polarized world, where it may seem like every other door in San Francisco is either a shop hiring bootcamp graduates, or a bootcamp itself.
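The eighty-twenty split is easy to see for yourself. Here is a minimal sketch (my own toy illustration, not part of the talk), assuming we model "impact" as draws from a Pareto distribution with shape parameter α = log(5)/log(4) ≈ 1.16, the value for which the top fifth holds roughly four fifths of the total:

```python
import math
import random

# Toy illustration of the Pareto principle (assumed model, not data):
# with shape alpha = log(5)/log(4) ~= 1.16, the top 20% of samples
# account for roughly 80% of the total mass.
random.seed(42)  # fixed seed so the sketch is reproducible
alpha = math.log(5) / math.log(4)
samples = sorted(random.paretovariate(alpha) for _ in range(100_000))
top_fifth = samples[int(0.8 * len(samples)):]  # the largest 20% of draws
share = sum(top_fifth) / sum(samples)
print(f"top 20% hold {share:.0%} of the total")
```

With a heavy tail this close to α = 1 the exact share fluctuates from run to run, which is rather the point: a few outliers carry most of the weight.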
The good news is: tech is swinging back to where it came from. Having a solid foundation in how computers do what they do is cool again. And Forty Two embodies that perfectly.
I have interviewed hundreds of developers over my career. Unless one's goal really is to get an okay offer and accept it as quickly as possible, there is evidence to back the claim that bootcamps and hackathons do everything but the most important part: explain what computer programming is really about. And while online courses chime in to fill the gap, the desire to be mentored by seasoned engineers is, from what I can see, at an all-time high. Today, most junior candidates cite being mentored by industry experts as the number one trait they want to see in the team they hope to join.
And while it does give me immense pleasure to see the younger generation's desire to understand computers as they really are, I think it is also a worrisome wake-up call.
Today, the technology stack is changing faster than ever. But even within IT, keeping the power law in mind, many seem to be in denial about it. Somewhat paradoxically, in many work environments neither tools, nor processes, nor mindsets change quickly enough to keep up with the pace of technology. Given that these environments (oftentimes large enterprises) happen to employ the lion's share of the software engineering workforce, it is indeed easy to look around and believe one's methods and skills are up to date and will remain in demand.
But it is not the case, as one quickly discovers by looking outside their comfort zone, especially at projects or companies of even a slightly different nature. Technology waves in software engineering are too powerful for us to only react after the fact. If you're into web frontends, React is the way to go, even if you don't necessarily need all its features. While PHP and Apache are here to stay for quite some time, most people won't consider them for a project they are starting today. If you're into data analytics, you will likely be using a streaming NoSQL database, or even a graph database with materialized views, although the interface to query the data could still be the good old SQL. If you claim knowledge of a certain programming language, then outside the BigCos the concepts and data structures introduced in its most recent standard will likely be in heavy use at the next shop you go to. Even functional programming, which many could say is still slowly dragging behind, is becoming more and more important, if not for putting new features into code, then certainly for understanding modern concepts such as immutability, persisted state, or actor-model data crunching.
The fundamentals always stay the same though. I still use Linux, grep, vim, and make. My bash usage patterns remain the same, except perhaps for introducing `jq` into the mix. O(N) algorithms will keep beating O(N*logN) ones on large inputs until at least the end of the Turing-machine era of computing. And, yes, as a friend of mine jokes, any real-life problem can still be solved with a savvy combination of hash maps and the Monte Carlo method. Some fifteen years ago I stumbled upon a good line in a job posting (which itself was probably another twenty years old): "The candidate must possess a solid understanding of Unix, and the good taste to not consider the above an accomplishment". As of today, this motto does not seem to have aged at all.
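My friend's joke is only half a joke. As one hypothetical sketch of "hash maps plus Monte Carlo" (my own contrived example, with made-up data): estimate how much two large collections overlap by random sampling, with a hash set providing O(1) membership tests, trading a small sampling error for not scanning everything.

```python
import random

# Toy sketch: estimate |A ∩ B| / |A| by Monte Carlo sampling,
# using a hash set for O(1) membership tests on B.
random.seed(1)
a = list(range(1_000_000))
b = set(range(500_000, 1_500_000))  # by construction, overlaps 50% of a

sample = random.sample(a, 10_000)        # the Monte Carlo part
hits = sum(1 for x in sample if x in b)  # the hash-map part
estimate = hits / len(sample)
print(f"estimated overlap: {estimate:.1%}")
```

Ten thousand samples put the estimate within about a percentage point of the true 50%, at one percent of the cost of the exhaustive check.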
The approach I consider the right one, and bless wholeheartedly, is what Forty Two does. Find energized, hungry, and motivated people with the mindset that makes you genuinely curious to get to the very bottom of what happens under the hood. Gather you all under the same roof. Offer an intense crash course, combining careful guidance with frequently leaving you on your own. Promote communication. And observe the results.
The knowledge of the LAMP or MEAN stacks does not survive the test of time. The knowledge of the fundamentals surely does. And Forty Two is invested in making sure all of you learn the latter.
I am a firm believer in substance over form. In software engineering, it so happens that the substance is virtually always right at our fingertips, often just a few taps away. To me, as well as to the best engineers I know, it's an embarrassment not to look into something that seems to behave unpredictably; and, trust me, we do it all the time. Yet more and more people seem to struggle to name a Unix command-line tool that uses DFS, BFS, or topological sort. And even from candidates who legitimately understand complexity, I often get a blank stare when I ask how long bubble-sorting a million integers would take on a modern MacBook. It is a dangerous trend, and I am very optimistic Forty Two is on track to reverse it.
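The question wants nothing more than a back-of-envelope estimate. A minimal sketch, assuming roughly 10^9 simple operations per second on a laptop core (my own ballpark figure, not a benchmark):

```python
# Bubble sort performs ~n*(n-1)/2 comparisons in the average/worst case.
n = 1_000_000
comparisons = n * (n - 1) // 2  # ~5 * 10^11 for a million elements
ops_per_second = 1e9            # assumed throughput of one modern core
seconds = comparisons / ops_per_second
print(f"~{seconds:.0f} s, i.e. on the order of minutes, not milliseconds")
```

That is the whole answer I hope to hear: five hundred billion comparisons, a gigahertz-scale budget, so minutes rather than the sub-second a decent O(N*logN) sort would take.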
On a closing note, I do see the world coming full circle once again. In the late 20th century, the software industry was dominated by nerds with values close to those of Sheldon Cooper, and those nerds, yours truly somewhat included, were possessed by nothing but computers. Coincidentally, we happened to be the magicians corporations went out of their way to hire. After that "era", in the early 21st century, we lived through a market correction, when it turned out that very many roles are a perfect fit for Howard Wolowitzes, and assuming "experimental" can be above or on par with "theoretical" is no longer a red flag in the industry. Unfortunately for all of us who prefer to get things done rather than talk about them, the correction went too far; but the pendulum is swinging back, and we now live in a brave new world where the knowledge of the substance of what computers do and how they do it is once again understood to be more important than the knowledge of the form of how to put together a simple web service or mobile app.
The future is brilliant. Cars already drive themselves. We will have augmented and virtual realities. We can no longer count on our fingers the number of fields where AI has surpassed humans in cognition and perception. We can edit human DNA live, and our medical devices, small enough to be implanted into our organs, are computationally more powerful than the machines our grandparents landed on other planets more than half a century ago.
It would be insane not to assume that most of you will be working in or next to some of these fascinating industries. And it would be just as insane to believe that shallow hands-on experience with the MEAN stack will be of much use there.
For the Knowledge we go.