I started my career in IT as a tester and test manager, and I’ve had twenty-odd years of working around the world at companies big and small.
Just when I thought the IT industry had stagnated and was subsiding into a morass of blunt instruments and simplistic aphorisms — along came a paradigm shift, the like of which I believe we have never seen before.
Cheap storage and virtualisation led to the cloud revolution, which led to continuous integration, which led to DevOps… well, it hasn’t been quite like that, but you get the idea. Throw in Big Data, the Internet-of-Things, Artificial Intelligence and Automation and we’re in for an interesting decade.
I’m not sure this revolution even has a name yet, but it will — the Golden Age of IT perhaps? (with a nod to Carlota Perez). All ‘technologies’ go through cycles of frenzied surge and then softening maturity where they become more ubiquitous and useful.
From “Technological Revolutions and Financial Capital” by Carlota Perez
It’s not that IT has become a commodity, as some suggest; we’re not there yet. It’s that commoditisation has moved up the IT stack to the point where value is being differentiated in new and interesting ways.
Commodities have a number of properties: they are fungible (you can replace one with another), they tend to be inelastic (their price doesn’t vary much with demand), and the premium to acquire them drops drastically as the barriers to accessing them are removed.
Once upon a time, purchasing IT infrastructure was a complex and expensive exercise with long lead times. The size of the investment meant you had to choose your infrastructure carefully, because of the cost both at acquisition and over its lifecycle.
Now you can literally spin up a server, or a hundred servers, at the click of a button, and pay only for the time you use. Virtualisation and containerisation mean you can deploy a staggering combination of operating systems and databases with another click. Microservices and serverless are simplifying middleware. That just leaves the application layer.
In a lot of ways the recent developments in software mirror my own career.
In 1997, with colleagues from Curtin University in Perth, I helped publish a book on incremental and iterative development. Some of the ideas we espoused shaped my own view of software development: rapid, evolutionary and value-driven.
But after that, I spent the next fifteen years of my working life as a test manager in the wilderness, trying to hold back a tide of excrement flowing down the waterfall. It was a fruitless and often thankless task.
It wasn’t that I didn’t enjoy it, or that I wasn’t successful (by all the benchmarks I’ve had a successful career), or that the projects I was involved in didn’t deliver value; it’s just that I always felt it could be so much better. After the heady days of the ’80s and ’90s, software development seemed far too hard and far too costly. The overhead of deploying even a simple application ranged from staggering to mind-boggling.
In one major Australian company I worked for, I had to front the head of a major division to explain why changing the length of a text field in his CRM would cost a minimum of $100,000 to test. The best answer I had was “it’s complex, it’s risky”.
One of the major influences on my recent working life has been Lean and the Toyota Way. By learning first-hand from Lean practitioners in other industries, I’ve been able to apply its principles to software development and learn a lot in the process. Lean is all about reducing overheads so you can focus on delivering value to your customers.
The confluence of cheap storage, cloud, automation and modern toolsets is setting the scene for a new stage of the information revolution. One where the effort invested returns the maximum value for the customer, not for the IT department.
One where software starts to deliver the value it always promised.
But be warned, this is not a new stage of expansion in IT, quite the contrary.
In about 2000 I read a paper from the UK that pointed out that IT had been growing at twice the rate of GDP for about ten years. If that continued for another ten, everyone in Britain would be working in IT by 2010. No one would be making cars, growing food or working in shops. Clearly that couldn’t happen (and it hasn’t).
Instead we’ve seen a consolidation — the big players have gotten bigger, smaller players have proliferated and the middle tier is largely disappearing. The same has happened with platforms and technologies and even toolsets.
The consolidation will continue. It will extend into organisational roles and structures. The software development industry is going to see the same kind of disintermediation that Uber brought to taxis and Airbnb to accommodation, and the world will ultimately benefit. Think of the third age of the industrial revolution, where increased productivity through automation led to increasing specialisation but reduced labour.
And this new landscape demands a new kind of tester… or a new way of thinking about testing.
No longer can testing be a siloed activity, performed by nominated individuals in isolation from other activities in the SDLC. It must be an integrated part of the practice of software development (this was probably always true, but now silos are becoming a definite constraint on the software value chain — hence the rise of DevOps).
In job adverts and the trade press nowadays you often see references to “full stack developers” — testing must also become full-stack, and it must be fast.
Testing must be an enabler for software development — not a constraint.
Many years ago, when I was working in the waterfall for those big companies, I wrote a ‘primer’ for aspiring testers around the world: an introductory guide to how and what to test, and how to add value in software development. It’s been picked up and reproduced in a few places, mainly because I distributed it freely.
I’ve never updated it, until now.
I’ve written a new version of the Primer, and I’ve tried to distil into it all I’ve learned from Lean, DevOps and Agile. The technical references in it will almost certainly be out of date before you read it, because the world is moving so fast, but the principles should remain the same.
This version of the Primer is for everyone who is interested in quality in software development — and that should be everyone, because delivering poor-quality software at speed is a certain recipe for failure.
I hope you find the Primer useful. Let me know.
Originally published at ec2-13-54-198-24.ap-southeast-2.compute.amazonaws.com on March 7, 2017.