Is Change the Only Constant in Data Analytics?
“The first step toward change is awareness. The second step is acceptance.” – Nathaniel Branden
If we look back today, we can see a big shift in technology. But as far as data analytics is concerned, we have seen it evolve faster than any other practice. Data analytics has always provided a competitive advantage to companies in every industry, be it retail, production, genetics or even rocket science for that matter. At the same time, it has also become very tempting for businesses to shift to new technology infrastructure and adapt to new tools.
How Did It Evolve?
Data analysis has evolved from exploratory data analysis (EDA) to cognitive thinking and learning. Earlier, the main objective of a data analyst was mostly reporting: presenting data in a more organised way in the form of reports or dashboards. Now one cognitive analysis is built upon another, which helps develop comprehensive analytics strategies.
As far as I have seen, it was not too long ago that Excel spreadsheets were used to analyse all types of data across industries. Excel was believed to do magic with its built-in functions.
As the volumes of data increased, data servers and storage became very significant. Since all the data was now being stored, it also had to be processed to yield meaningful information. Hence SQL queries came to the rescue.
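As a small illustration of what that looked like, here is a minimal sketch in Python using an in-memory SQLite database (the `sales` table and its columns are hypothetical stand-ins for a real transaction store):

```python
import sqlite3

# In-memory database standing in for a company's transaction store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120.0), ("South", 80.0), ("North", 50.0)],
)

# A simple aggregation: total sales per region -- the kind of summary
# that once had to be assembled by hand in a spreadsheet.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # → [('North', 170.0), ('South', 80.0)]
```

One `GROUP BY` turns raw stored rows into meaningful information, which is exactly why SQL scaled so much better than spreadsheet formulas.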
Excel and SQL were self-sufficient for quite some time, until Business Intelligence platforms came into the picture. These platforms made predictive analytics a lot simpler.
Experts say that the analytics industry has seen three generations: Analytics 1.0, Analytics 2.0 and now the Analytics 3.0 era. Analytics 1.0 was the era of Business Intelligence, where the majority of analytical activity was descriptive analytics, or reporting. Analytics 2.0 was the era of Big Data: the data was mostly unstructured and came not only from internal transaction systems but also from external internet sources.
Now we are in the era of Analytics 3.0, which has witnessed a substantial overhaul. Big Data analytics, machine learning, cognitive computing and cloud storage platforms all operate simultaneously. We can now get actionable insights on huge volumes of structured and unstructured data in a fraction of a second, and strategic decision-making is expected to become more streamlined.
As a matter of fact, what seems most surprising is how fast analytics revamped itself to suit the requirements of each of these eras.
The Changing Technology Stack
Analytics tools have evolved alongside changes in infrastructure, technologies and the IT industry itself.
Until a couple of years ago, SAS was considered a very sought-after certification. In a matter of a few years, it has started fading from the picture.
To analyse the data, Oracle SQL and MySQL were the most widely used technologies, but now they are being replaced by Hive and Amazon's cloud platforms for their ability to process huge volumes of unstructured data faster.
There are also open-source languages with built-in software packages. Python and Scala are languages with distributed-computing capabilities, and platforms such as Druid take the same approach to queries: the work is parallelised across many processes or machines.
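As a rough sketch of what parallelising a process means in practice, here is a minimal Python example using the standard `multiprocessing` module (the data and the `summarise` function are hypothetical stand-ins for a real analytical workload):

```python
from multiprocessing import Pool


def summarise(chunk):
    """Compute a partial result for one chunk of records."""
    return sum(chunk)


def parallel_total(data, workers=4):
    """Split the data into chunks, process them in parallel,
    then combine the partial results -- the map/reduce idea in miniature."""
    chunks = [data[i::workers] for i in range(workers)]
    with Pool(processes=workers) as pool:
        partial = pool.map(summarise, chunks)
    return sum(partial)


if __name__ == "__main__":
    print(parallel_total(list(range(1_000_000))))  # → 499999500000
```

Frameworks built on Scala (such as Spark) and engines like Druid apply this same split-process-combine pattern at cluster scale, which is what makes them fast on huge datasets.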
MicroStrategy was the leader for quite some time. Now we can see the market shifting towards Tableau, Sling etc. Tableau, known for its ability to integrate with almost all platforms, looks like quite a sustainable product nowadays.
And it’s not just the tools that are changing. Every business now looks for a customised analytics package to solve its business needs. We have retail analytics, sales analytics, service analytics, HR analytics, and the list goes on. For example, in a retail set-up, businesses would look for a solution that provides inventory optimisation, store-level management, supply and demand planning, customer experience and insights, etc. On the other hand, cricket analytics demands dashboards with many charts, KPIs and metrics, predictive analytics, popular match analyses, etc. So why would businesses adopt one solution for such different purposes? Hence we constantly see new tools in the market with streamlined processes for each industry and its specific requirements.

What’s Next
This change is inevitable. Businesses are making huge investments in infrastructure, and with every second the volume of data is increasing exponentially. The existing technology has to evolve to support such huge volumes of structured and unstructured data.
Hence it is likely that we will witness even more dramatic changes in technology: new tools, platforms, programming languages and everything else required to carry this revolution forward.
So with this volatility, I don’t see the point in mastering any one of these tools or programming languages as we progress in our professional careers. I would like to conclude, against the odds, by saying that it is now better to be a “jack of all trades and master of none”.
