One of my tech friends transformed his software career from manual testing to Big Data two years ago. Everything was great: his salary doubled, and his mortgage eased thanks to the quick liquidity of stock options.
But the cherry on top was the extremely liberating work culture, compared to the sweatshops he had worked in before.
He deserved it. His struggle involved sleepless nights and weekends spent going through certification videos, practicing assignments, and updating his CV. He sensed the demand in time, grew anxious with preparation, and had an extremely difficult time juggling interviews with his job.
But in the end, it all paid off. And I was happy for him.
Six months later, he messages: “Looking for a change…”
“You just entered a hot industry… has the boss changed?”
“No — everything is cool as before. It’s just me. Don’t enjoy the work. Plus, just look at how much cloud developers earn.”
While I couldn’t understand his motivation, there was no sin in wishing to earn the best bucks for one’s talent. “OK, what now?”
“I will get a cloud certification.”
“Agreed, cloud is hot and evergreen. But it’s no longer in the certification gold-rush era you’re looking for. That time is over. You need to show real stuff on GitHub to even get a cloud interview.”
“So where is the certification gold rush right now? I don’t want this complex GitHub stuff.”
My answer was ready in my head, but I was put off by his last statement.
“OK, whenever I spot that gold rush, I will test it first myself 🙂, and then share the results with you. For now, just use your big data experience to apply in bigger companies.” Let him learn the ropes, the hard way.
“Sounds good. But I want you to review my CV. I got no response from 10+ employers so far. Maybe, 6 months experience doesn’t count enough. The usual bullshit of numbers…”
I myself wasn’t even a beginner in Big Data. But he had huge faith in my CV-review capabilities, something I no longer liked doing as I grew gray-haired.
My chats with him told me that a real Big Data expert could easily see his worth, which was more than what he was being paid.
I started reading his CV:
- Experience designing end to end Hadoop infrastructure with MapReduce and Spark
- Experience working with MapReduce program for Hadoop
- Experience monitoring Hadoop infra using Ganglia.
Company A (June 2017–Dec 2017)
- Analyzing Hadoop Cluster using Pig, Hive, Spark, Scala, Sqoop
- Used Spark API over Hadoop YARN
- Developed Map Reduce programs for applying business Rules for Data
I stopped reading after half a page. It was nothing but the output of a word-adjective generator spread across five pages, a spinoff of non-Big-Data skills. While I believed it really hid some technical complexity, the CV made it all look the same, nothing worth a programmer’s salt.
Being Big Data illiterate myself, I needed some more research. A basic Google search on Big Data CVs revealed more daunting examples.
Everyone was doing the same work!
And it’s not just Big Data: even my resume as a Windows developer a few years back was no different.
- Analyzing Requirements.
- Creating UML diagrams for design.
- Creating SQL queries to fetch data from database.
- Implemented logging.
- Developed Windows application using C++ that handles user events, presents data in the UI and saves user-edited data.
I cringed at every line: which published app on earth didn’t do those jobs, and how was mine any different? Was I reading a job description scribbled by a non-tech recruiter?
I started to wrestle mentally with reality.
CVs don’t tell how good we are. Because employers never have challenging projects.
Interviews don’t tell how good we are. Interviewers always suck. And come on, with all the fanfare, it still is a buyers’ market.
But that doesn’t change the truth:
The biggest problem that developers never address is: they themselves don’t tell how good they are.
(and with most of them, there is rarely anything to tell. Trust me. 😉)
And this makes a perfect business case for recruiters and employers to lump the wheat with the chaff.
So, is this all about a better CV format?
Not at all.
We are living in times when people are willing to pay unthinkable sums for resume formatting, which brings limited, unpredictable value over a short period of time.
A programmer can go far beyond what is cited on CV pages. And this is not the productivity rant you hear from your boss every day, but a cry of passion from within yourself that you decide to ignore:
- When you send that boilerplate CV to fifty employers.
- When you compulsively write copy-paste code and feel proud to match those stupid skill descriptions in your CV. (A LEGO exercise is far more creative than that.)
So the CV needs an uplift, right?
Yes, but not in terms of words or format. In terms of content.
It’s an overhaul, not an uplift.
Could Google exist in 1960s?
The Internet did, during the Cold War. And as early as 1937, before WWII, H. G. Wells, author of The Time Machine, had already declared the World Brain (an idea resembling the WWW) the most vital need to unify knowledge and pacify quarreling nations. The idea died a premature death in WWII.
Soon after WWII, the USA became the Wild West of electronic and computing research. Government and industry were in an unprecedented rivalry to take custody of the newborn babies of science.
“Programmer” wasn’t a job title yet. Every new scientific application required a novel computer design, and the designer was responsible for writing the necessary software, because software was the manual, not the product.
New research created an obvious problem: information overload. Postwar America needed a classification system for its exponentially exploding patents. An IBM engineer named Hans Peter Luhn filed a patent in 1954 for a machine that could verify credit card numbers.
An entire machine to verify a single checksum digit!
(Before you discard it as overkill, make sure you practice this checksum algorithm for your next technical interview 😉. Low ambition isn’t a crime if it feeds our mouths.)
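The checksum itself fits in a few lines. Here is a minimal Python sketch of Luhn’s algorithm (the function name and sample numbers are mine, for illustration):

```python
def luhn_valid(number: str) -> bool:
    """Luhn check: double every second digit from the right,
    subtract 9 from any doubled digit above 9, and sum everything.
    The number is valid when the total is divisible by 10."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:        # every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("79927398713"))  # a standard Luhn test number -> True
print(luhn_valid("79927398710"))  # last digit off by one -> False
```

That single trailing digit is what Luhn’s 1954 machine verified in hardware.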
Luhn made the real breakthrough in 1958, when he designed a machine that, given a text article, produced its abstract.
The technique was to:
- Collect the most frequently used words (minus stop words like is, the, for, etc.)
- Map them against sentences (context) they were used in.
The technique is called Keywords in Context (KWIC), and it is perhaps the most effective universal knowledge boost since Gutenberg.
In today’s world, KWIC is omnipresent: search keywords highlighted along with the sentences they appear in. In fact, its absence before the 1960s is hard to believe.
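The two steps above can be sketched in a few lines of Python (the stop-word list, function name, and sample text are my own simplification, not Luhn’s actual design):

```python
import re
from collections import Counter

# A tiny illustrative stop-word list; real systems use much larger ones.
STOP_WORDS = {"is", "the", "for", "a", "an", "and", "of", "to", "in", "when"}

def kwic(text: str, top_n: int = 3) -> dict:
    """Keywords-in-Context sketch: pick the most frequent non-stop words,
    then map each keyword to the sentences (contexts) it appears in."""
    sentences = [s.strip() for s in re.split(r"[.!?]", text) if s.strip()]
    words = [w for w in re.findall(r"[a-z]+", text.lower())
             if w not in STOP_WORDS]
    keywords = [w for w, _ in Counter(words).most_common(top_n)]
    return {k: [s for s in sentences
                if re.search(rf"\b{k}\b", s.lower())]
            for k in keywords}

doc = ("Hashing maps data to buckets. A good hash spreads data evenly. "
       "Buckets overflow when the hash is poor.")
print(kwic(doc))
```

Each keyword comes back attached to its contexts, which is exactly what a search result page highlights today.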
Back to Luhn’s invention: verification led to classification, the idea being that data itself could become its own key to unlock a bucket.
We know that key as a hash. And everything from passwords to Bitcoin depends on hashing.
Not without reason, Luhn was christened the father of information retrieval. KWIC is what Google does; Google’s creativity lies in applying PageRank plus advertising on top of the hash.
But why are we talking about Advanced Computing and Creativity in the same article?
Because of the deceptive simplicity of hash.
Not unlike the Mona Lisa’s smile. And art requires creativity.
Hashing is easy to get immersed in because of its real-life bucket-distribution analogy. Utility notwithstanding, the following statement looks much more elegant and safer (we hold the element by its key!):
dictionary["key"] = element
It is deceptive, because evenly distributing data among several buckets is a programming challenge that no Google aspirant can afford to miss.
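To see why distribution is a real challenge, here is a toy polynomial hash in Python. The multiplier 31 is the classic choice from Java’s String.hashCode; the function name and key set are mine, for illustration:

```python
def bucket_of(key: str, n_buckets: int = 8) -> int:
    """Toy polynomial hash: fold the key's characters into one integer,
    then reduce it to a bucket index with a modulus.
    Real hash tables work hard to keep these buckets evenly filled."""
    h = 0
    for ch in key:
        h = (h * 31 + ord(ch)) % (2 ** 32)  # 31: classic hash multiplier
    return h % n_buckets

# Similar or adversarial keys can still pile into a few buckets --
# keeping the spread even is the actual engineering problem.
for k in ["alice", "bob", "carol", "dave", "erin"]:
    print(k, "-> bucket", bucket_of(k))
```

The one-line dictionary assignment hides all of this: the language runtime picks the bucket for you, and does it well.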
But why is it important to be creative so suddenly?
In the ’60s, programmers were not in demand, because computers were close to nonexistent. But engineers were needed, for precisely the same reason: designing products required pen and paper. And minds.
In the near future, this may be true again for programmers, because computers do exist, in abundance. And they have minds. Programmers will be required, but will remain highly undervalued.
The creation of newer products that continually require human intelligence (not physical intervention) will be far more crucial in a future where robotics and data science rule the world.
OK, but we don’t see it yet:
Understandable. But what you can see now is this:
- Lower (<10%) response rates to your CVs since every HR department started using email, despite your buying that expensive resume-writing service
- Lower visibility on your company’s product roadmap, a.k.a. the “no one listens to me in our product design team meetings” problem
- Depleting CTRs on your website, Facebook posts, YouTube streams, and Instagram feeds despite hiring the best content creators your pockets can afford (before you nail that round #0 from a startup angel)
But creativity can’t solve those problems
Instead, just tell us to use eye-candy resumes and cutting-edge cover letters.
On the contrary, creativity is:
Over the course of the last decade, however, we seem to have reached a general agreement that creativity involves the production of novel, useful products
- Michael Mumford, a psychologist in creativity research, University of Oklahoma
Accordingly, CV formatting is not a demonstration of creativity worth a programmer’s brainpower. (Creating an eye-candy online resume-formatting service could qualify, though 😉)
The very fact that programmers can’t think of creativity beyond those fixations proves it is substantially missing. That is precisely why so many genius devs remain lifetime implementors, while so many mediocre devs make exits, become power holders, highly paid product managers, or millionaire entrepreneurs.
If you think of all the milestones where software changed human lives, it was a creative programmer at work, just like Luhn. And far beyond cool UI or smooth UX.
Once you see it, it is hard to unsee the role of creativity:
- How Google changed email client from message-centric to conversation-centric software
- How Apple redefined computing with elegant hardware design combined with cool software design, and later replicated it for mobile experience.
- How torrent sites altered the way information is distributed (also paving the way for blockchain, decentralizing data ownership)
- How merchant services such as Paypal and Stripe made it easier to receive money for developers worldwide without relying on archaic Electronic Funds Transfer by banks.
- How Youtube singlehandedly marginalized satellite channels in the living room
Yes, but you mentioned my 3 problems, what about them?
Here goes that problem-list again, along with possible solutions.
- Lower response rates to your CV: Never aim to address this just with eye-candy resumes, unless you are a UI designer, or an entry-level developer, or someone just dying to get in (nice try, no hire made, but take your chance). Instead, fatten your GitHub (link it in your CV), and in a meaningful way, without static code dumps. Fork and follow your favorite repos, make pull requests, fail, raise issues, fail again, succeed, get your hands dirty, make your handle known.
- Unhighlight the obvious. “Using SQL to fetch data” really sucks, unless you are the creator or first user of SQL. If your app does everything obvious programmatically (i.e. fetching data from the DB and the internet, allowing user entry, etc.), simply state what it does functionally in a single sentence.
- (Ewww… was that creativity? I thought it was toil! 👍 Just wait. Creativity isn’t about doing all things differently, just the one thing your competitors are unaware of. BTW, the rest are mandatory. 😮)
- Don’t ever limit your dreams to the Big N (FAAMG+). On top of those forks and PRs, try originality: create apps, websites, frameworks, and SDKs that the wider population can use freely. Once tested, start selling them, forever or one-time. It could be a completely novel experience like Pokémon, or an old gem with a twist of powerful gamification like QuizUp. Before you start making your great product, read the creation story of Instagram. Read the publications of Pinterest. Learn how YouTube understands our videos, how Quora handles its task queue, and what Netflix’s design principles in data science are. After grasping them, publish your own versions, even if they are tiny implementations of those megaliths. Move slowly, but force everyone to take notice, while you are still at your current job, instead of actively targeting the Big N (FAAMG+) and getting royally disappointed.
- Lower visibility on your company’s product roadmap: For your inner calling’s sake, read, read, and read laymen’s articles on top of technology tutorials. Some priceless gems: how Node.js was created, why Java rules the popularity charts, what made Chrome the fastest browser at its launch, what made Amazon adopt a microservices architecture, the WhatsApp architecture, what the most sophisticated piece of software ever written is. The list is endless. The content? Priceless. And the benefits of such reading are never limited to elevated esteem within your circle; you never know whom you’ll meet at your next Thursday beer party.
- Depleting CTRs on your marketing campaigns: Creativity in marketing can only bring growth if you are already sizable, with some convinced customer base. If not, creativity in product design and customer communication is your only sledgehammer.
- The gist is to stop using your own dictionary and start getting into your customers’ mindset. Know their problems. Know them firsthand. Addressing them in your punchlines (website meta tags and blog-post excerpts) converts best. If you cannot address them, it is high time to go back to product redesign. You cannot do better if you are not better.
As personal creativity is always subjective, the above are just examples.
Creativity can be studied by reading books, and you surely should read about it. But cultivating creativity (after admitting you need it) requires considerable practice, along with tangible starting points.
Since Luhn’s hash invention, the human race has come quite far: from almost no computational devices to mammoth automation industries. Humans, not just programmers, need creativity, not to win the rat race, but to remain relevant.
If not for themselves, then for the generations to come, who will hopefully never see or draft those horrible Hadoop CVs for a raise.