The Origins of the Big Data Trend

Olga Kouzina
Published in Quandoo
Apr 26, 2019 · 5 min read

… and why it implies a lot more than what we see on the surface.

I guess it wouldn’t be an exaggeration to say that software development is the most knowledge-intensive field of production there is. No other industry relies so heavily on the balanced influences of society and technology. The ever-changing ecosystem of software development thrives on knowledge… or is it wisdom… or is it information… or is it Big Data? There might be some confusion here, because “Big Data” is a far trendier term these days than “knowledge” or “information”, not to mention “wisdom”. With so much emphasis on data, the other three options look greyed out. Little is said about why organizations might or might not need Big Data, and where this whole movement is rooted. Curiosity may have killed the cat, but it can certainly save many humans, so I took a deeper look into all things Big Data and came to some interesting conclusions. It makes sense to note the points of departure and destination before running to catch the Big Data train.

Previously, I’ve explored how human needs and reactions triggered the rise of agile and then of Kanban. The Big Data trend is tied to human-related incentives as well: to the shortage of the skills required to fulfill certain needs, to be exact. Let’s look into what those needs and skills are.

Rewind to the early ’80s, or even the ’70s. Without going into too much detail, the trends that began to prevail in education around that time were shaped by the philosophy of pragmatism, and still are. I’m talking mostly about the United States. The summary of this philosophy is: business only. When young people make their educational choices based on this thinking, they want to pick a profession that will pay off their college loans fastest. Computer science and finance seem best suited for that, and such narrow specialization looks like a reasonable way to start earning money quickly.

This “business only” trend in education looked, and still looks, like an excellent option, deceptively so. Why would someone need more skills and more knowledge than exactly what is required for doing their job with computers or with finances? That’s where the trap is. With several generations of engineers and tech business executives who majored in their narrow specializations, the community of tech professionals has encased itself in one-sided thinking and experience. Living in their tech-only universe, these people at some point feel the lack of skills required for wise decision-making and business leadership, yet they have only their limited domain to look to for clues, and hence resort to the Big Data or AI panacea.

Obviously, tech businesses are in permanent shortage of efficient leaders who can steer the wheel sensitively and wisely. The current setup of education, however, be it high school, college, or university, is ill-suited to raising such individuals. The point is that a proficient tech leader has to be knowledgeable in the humanities, in addition to computer science. Someone with a technical background will probably want scientific proof for that argument. I have that proof, and I hope to share it in an article one day, although it might be quite hard to prove this technically. A technical background endows people with the ability to keep mental focus, no doubt. But the ability to focus alone is not enough. What if the focus is on the wrong target? From what I’ve observed, techies mostly fail to see how nuances and small things can make a big impact. Tech business leaders have to wear many hats, see the big picture, stick to common sense, and mix it with foresight and intuition. That’s what the humanities bring to the table. People with such a mixed background have become a scarce treasure these days.

Back to Big Data. This trend is there for a reason. Technical folks genuinely want to squeeze all they can out of their limited technical universe. It’s about the same as with Newtonian physics vs. quantum physics: neither good nor bad, just too narrow, and techies are taking the course of action they believe to be the best. There are exceptions, of course, but exceptions only confirm the rule. For example, this guy has delivered a presentation called The Ephemeral Role of Data in Decision-Making, and I’m certain that he has some sort of humanities background behind him, if not through formal schooling then acquired in some other way. Big Data is touted with much fanfare, and in some organizations it would be professional suicide to stand up and question the common sense of this gold rush. Techies eagerly measure all they can, assuming, figuratively speaking, that the count of molecules in water will dictate the shape of the ocean waves. By the way, the original concept of Big Data stood rather far from where it is now: in the ’80s and early ’90s it was about the physical space needed to store all the data. With storage costs declining, the Big Data trend found an outlet in another barn and took its current shape. The overvalued significance of Big Data reminds me of the days when lavish praises were sung to Facebook, until with time it became clear that Facebook, along with its Big Data, is… well, by 2019 we probably, kind of, have an idea what it is.

Summing up, the Big Data trend and its misuse are rooted, for one thing, in a short-sighted approach to education (and there are other roots). We are facing the consequences. Too few technical leaders can rely on their own gut and broad background in making balanced business choices. As if enchanted, they see nothing except the very questionable maxim that past trends will predict future trends and hence provide some safe ground for their decisions. Hell, no. But someone will reap the harvest, as there’s a whole tribe of consultants and companies waving the banner that reads “leverage your Big Data”.

As you read this article, you’ve probably noticed that my conclusions come from various sources: trends in education, philosophy, social sciences. Thinking is the hardest job; joining the flock is much easier. A mule tempted by a carrot will follow it anywhere without thinking. It takes more than being a mule to stop, look around, contemplate the hidden driving forces of what is happening, and take a sober, independent decision about what your organization needs. Big Data has to be approached with reason and caution. There’s no point in collecting data just for the sake of measuring; there are many more interesting things in the world to do. Speaking of interesting things, the humanities combined with a computer science education are what we need to nurture wise (tech) leaders. Yes, there are costs involved. But where will the cost be higher, and for whom: letting it all go on with the same impaired, narrow education, or finding a way to nurture the thinking individuals who are indispensable in any position of governance, not only in tech?

Related:

Back to the Future of Agile Software Development

Prioritization and Big Data? Think Human Nature

Watch:

The Ephemeral Role of Data in Decision Making

This story is based on an earlier article.


Olga Kouzina

A Big Picture pragmatist; an advocate for humanity and human speak in technology and in everything. My full profile: https://www.linkedin.com/in/olgakouzina/