It’s that time again. Interview time. At the software consultancy I work at, we are always trying to figure out the best way to find the best people. It’s not easy. Even as a company, we share the same indecisiveness that individuals naturally have; we don’t always know exactly what we want. In this case, when it comes to potential candidates, the ongoing debate usually involves the priority and importance of a person's varying skills, experiences, and technical background. It’s actually an interesting topic of discussion with different opinions from all of our consultants. As we continue to discuss and figure that out, I recently wondered how I myself gained the skills that got me into the career I am in today. At least one thing was for sure: a number of those skills didn’t come from my prior schooling. But could they have?
I fortunately went to both community college and university to study software engineering and computer science, respectively. The commonly expressed distinction between these two types of educational institutions is that while the former tends to be more practical, the latter focuses mainly on the theoretical. I agree with this to a certain extent, but what concerns me is not how the two differed, but rather what I feel they both lacked. In my experience, neither institution type fully provided the education needed to grasp what real-world software development is like.
For the most part, hands-on experience was how I eventually learned what I felt was missing during my time in school. It came down to learning from others, often team members with prior experience, or learning on my own. If you are involved in a computing or software-related career, this should be no surprise. It's common knowledge that in a fast-paced field like this, constantly learning about technology, tools, practices, and philosophies is a day-to-day necessity. However, I've encountered interview candidates who still lack what I would consider core knowledge or experience. This led me to wonder whether universities, colleges, and other educational institutions would do better to help their students get a jump start on their possible futures in industry careers by covering some of these core areas.
I am not entirely inexperienced with teaching at the tertiary level; I did some lecturing and tutorial assistance during my postgraduate years. However, it's only now that I consider my personal input relevant enough to make a suggestion to higher education institutions, namely my alma maters of Western Sydney University and Mohawk College: update your computing and software curricula to include topics I think would greatly benefit graduate (or junior) developers. Even with the time passed since I was a student, the curriculum still seems to be lacking in some areas. That's not to say these institutions haven't stayed relevant, though. For example, when I was a student, there wasn't an elective course covering the theory and practice of good software testing, which is now available at Mohawk. But I still believe there are other gaps. Here is what I additionally suggest:
- Development operations (DevOps), primarily explaining how production software is actually put out there into use. This should also give a glimpse of how software is tested and treated internally in non-production environments. I think software deployment is also highly relevant. This would cover current practices like continuous integration and deployment, as well as other notable release patterns, automations, and lifecycles; software packaging across different languages and platforms (e.g. containerization); and proper application configuration, something I always felt was missing from the numerous "hello world"-level programs and projects we made in school.
- Understanding user interfaces and experiences to a minimal level. Personally, I didn’t realise how important this was until I understood the importance of computer and software accessibility.
- In addition to testing, which hopefully is taught beyond simply unit testing, students should be exposed to concepts from quality assurance, especially from a business owner/stakeholder perspective. I'm not saying everyone should become a QA engineer, but rather that students should gain an understanding of, and empathy for, their purpose in the software development cycle.
- For the same reason software analysis and design is taught, having a curriculum around "agile" and other current methodologies would be useful. People pay considerable amounts of money to understand what "agile" actually entails, but I think even a light introduction to some of the techniques and practices under this moniker that software developers use (or are forced to use) day-to-day would be sufficient to start.
- Generally, students should know how to provide support for the software they create. This includes writing documentation and troubleshooting help (e.g. wikis, READMEs). It also means understanding that you might one day be on-call for something you helped put into production, or for something you never had anything to do with. You should feel assured that users and future developers can continue to use and maintain your software. Consultants must especially be heedful of this.
I wasn't 100% sure where to place this, but I think giving insight into your software — for example metrics, messaging, and logging — also sits under the umbrella of support.
- Metrics don't just provide support, but also help with performance analysis. A basic understanding of computational complexity will always be necessary to help software developers understand the ideal ways (or tradeoffs) to implement their code and data structures. However, in industry environments, working with large-scale and distributed services demands knowing how to measure actual system behaviour and understanding the significance of certain data (e.g. percentiles vs. mean). Beyond knowing the theory of performance analysis, students should also be exposed to the practical tools software developers rely on.
- I personally did get a lot of experience with debugging in school, but it felt ad hoc. I believe a short introduction to techniques and practices, particularly in modern tools like IDEs, would be beneficial. This is likely doubly true given that most people — especially those who started off as junior developers — agree the best way to understand a pre-existing codebase is to trawl through it.
- With the increase in entrepreneurship (and the correlated decrease in company sizes), it would be useful for students to get some exposure to lessons that cover this domain. For example, in addition to software analysis and design, students could learn what MVPs and POCs are, and when they should be created, modified, and/or thrown away.
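To make the percentiles-vs-mean point above concrete, here is a minimal sketch using Python's standard library. The latency numbers are made up purely for illustration: one slow outlier among mostly fast requests.

```python
import statistics

# Hypothetical request latencies in milliseconds: mostly fast, one slow outlier.
latencies = [12, 11, 13, 12, 14, 11, 12, 13, 950, 12]

mean = statistics.mean(latencies)
percentiles = statistics.quantiles(latencies, n=100)  # 99 cut points
p50 = percentiles[49]  # median: the "typical" request
p99 = percentiles[98]  # 99th percentile: the slow tail

print(f"mean: {mean:.1f} ms")  # 106.0 — skewed upward by the single outlier
print(f"p50:  {p50:.1f} ms")   # 12.0 — what most users actually experience
print(f"p99:  {p99:.1f} ms")   # ~847 — the tail the mean completely hides
```

A single slow request drags the mean to nearly nine times the median, which is exactly why dashboards for large-scale services report p50/p95/p99 rather than averages.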
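On the point about giving insight into your software for supportability, here is a minimal sketch of the kind of logging an application might emit so that whoever is on-call can troubleshoot it. The component and order names are hypothetical.

```python
import logging

# Configure a simple log format; real services would also ship these
# records to a central aggregator for searching and alerting.
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s %(message)s",
)
log = logging.getLogger("payments.worker")  # hypothetical component name

def process_order(order_id: str) -> bool:
    """Pretend to process an order, logging enough context to troubleshoot."""
    log.info("processing order_id=%s", order_id)
    try:
        # ... real work would happen here ...
        return True
    except Exception:
        # logging.exception records the stack trace alongside the message
        log.exception("failed to process order_id=%s", order_id)
        return False

process_order("A-1001")
```

The key habit is including identifying context (like the order ID) in every message, so a 3 a.m. page can be traced back to a specific request rather than a vague "something failed".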
How topics like these are taught is also up for debate. Some can easily be taught in a day; others could take a semester. As I stated earlier, this field is constantly changing and evolving. That said, schools need to keep pace with the same changes to prepare their future graduates for what is currently considered real-world software. In the end, the biggest win is that software people gain more confidence in their field.
And maybe a slightly better impression during a job interview.