Analytics now: leveraging HTAP to enable an analytics value chain without limits

a blog by Matthias Funke

Business applications are transforming to include analytics, and this is changing the way we do business. In the old paradigm, analytics were often operational reports that followed a predictable schedule of “data in” and “reports out” sometime later.

Today, analytics are much broader, and we know that insights are most valuable when the data is freshest. For example, let’s assume you are driving in a Formula One race. How useful is information about the performance of your car after the race? Well, if you want to win, you need information during the race so you can make continuous adjustments.

The same idea is true in today’s analytics. You need immediate information so you can make changes that deliver the best possible business result. So the next evolution of analytics must provide insights on data as it is created, even before it lands in a separate analytics system, and this is the key use case for Hybrid Transactional/Analytical Processing, or HTAP.

HTAP meets a growing list of “analytics now” use cases

HTAP makes real-time analytics possible in the transactional or operational database, providing analytics across events as they are happening. The result is the ability to improve customer experiences, increase revenue, reduce risk or gain another advantage by influencing events rather than merely learning afterward that they occurred. In our Formula One example, HTAP analytics might indicate when to make a pit stop and which adjustments to make.
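
To make the idea concrete, here is a minimal sketch in Python. It assumes the ibm_db_dbi DB-API driver for DB2, and the connection string, TRANSACTIONS table and its columns are hypothetical; the point is simply that the analytical query runs directly against the live operational data rather than against a copy exported to a separate warehouse.

```python
# Minimal HTAP-style sketch: an analytical query over live transactional data.
# Assumes the ibm_db_dbi driver; connection details, table and columns are hypothetical.
import ibm_db_dbi  # DB2's DB-API wrapper; any DB-API driver is used the same way

conn = ibm_db_dbi.connect(
    "DATABASE=ops;HOSTNAME=dbhost;PORT=50000;PROTOCOL=TCPIP;UID=user;PWD=secret"
)

# Per-account spend and average transaction size over the last 10 minutes,
# computed straight from the operational TRANSACTIONS table.
sql = """
SELECT account_id,
       SUM(amount) AS spend_last_10_min,
       AVG(amount) AS avg_amount
FROM   transactions
WHERE  tx_time > CURRENT TIMESTAMP - 10 MINUTES
GROUP  BY account_id
ORDER  BY spend_last_10_min DESC
FETCH  FIRST 20 ROWS ONLY
"""

cursor = conn.cursor()
cursor.execute(sql)
for account_id, spend, avg_amount in cursor.fetchall():
    print(account_id, spend, avg_amount)
conn.close()
```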

Obvious candidates for these insights are fraud detection, cyber security and physical security, where you must have information immediately. There are many more examples, including customer service, supply chain and logistics, financial management and healthcare. When applications can detect and respond through data insights, the business itself transforms.

Simplifying IT with HTAP

There is also a basic benefit close to every IT manager’s heart: IT simplicity. HTAP helps consolidate workloads and reduces the number of systems needed to capture transactional data and run analytics. This simplifies the data architecture and reduces overall staffing needs and costs.

Many of you are thinking about what happens to the performance and tuning of key transaction systems when you add compute-intensive analytics. Advances in technology mean that today’s systems and software are capable of doing both types of processing simultaneously at high speed. A number of technologies play a partnering role in an HTAP scenario:

  • Event-driven applications are emerging as organizations become more agile. They are designed as collections of microservices, orchestrated by a business process, that together provide the application’s functionality. These microservices can be delivered and changed in an agile fashion to meet changing needs and to provide differentiation. When analytical microservices are combined into the application, they provide real-time monitors that drive intelligent business processes and the ability to identify and react to changing markets through real-time reporting or machine learning.
  • In-memory technology processes large amounts of data very quickly while minimizing the impact on transaction performance. These applications bring the data into memory to reduce latency and speed up results. In-memory technology includes the open source Apache Spark and, in the case of IBM DB2, BLU Acceleration (a minimal Spark sketch follows this list).
  • Machine learning works by analyzing data, learning from it and improving itself to deliver better results under changing conditions. Machine learning becomes even more powerful when it can work with transaction data as it is created (see the incremental-learning sketch after this list).
  • Data governance catalogs and data quality services help users understand where to find data, how to access it, and when to trust it. Remember: while HTAP is all about unifying different workload types and data for analysis, the benefit depends on how easily you can access data sources and how well you free end users from depending on IT to access that data.
  • Data science visualization and predictive algorithms are developed using Python, Scala, R, Spark, and machine learning. Once created, these algorithms can be pushed to HTAP systems for immediate insights.
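
To illustrate the in-memory point above, here is a minimal PySpark sketch. The input path and column names are hypothetical; the idea is that the data is pinned in memory once so repeated analytical queries run quickly without returning to the operational system.

```python
# Minimal PySpark sketch of in-memory analytics; path and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("htap-inmemory-sketch").getOrCreate()

# Load a snapshot of recent transactions and pin it in memory so repeated
# analytical queries do not touch the operational system again.
tx = spark.read.parquet("/data/recent_transactions").cache()

# Aggregations reuse the same in-memory data set.
by_account = tx.groupBy("account_id").agg(
    F.count("*").alias("tx_count"),
    F.sum("amount").alias("total_spend"),
)
by_account.orderBy(F.desc("total_spend")).show(10)

spark.stop()
```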
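
And to illustrate learning from transaction data as it is created, here is a minimal scikit-learn sketch using incremental training. The fetch_new_transactions feed and the feature layout are made-up stand-ins for whatever stream of fresh transactions your system exposes.

```python
# Minimal sketch of a model that keeps improving as new transactions arrive.
# The data feed and features below are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import SGDClassifier

model = SGDClassifier()            # a linear classifier trained incrementally
classes = np.array([0, 1])         # e.g. 0 = normal, 1 = suspicious

def fetch_new_transactions():
    """Placeholder: yield (features, labels) mini-batches of fresh transactions."""
    rng = np.random.default_rng(0)
    for _ in range(5):
        X = rng.normal(size=(100, 4))            # 4 made-up numeric features
        y = (X[:, 0] + X[:, 1] > 1).astype(int)  # made-up labeling rule
        yield X, y

# Each mini-batch of newly created transactions refines the model in place,
# so scoring always reflects the latest behavior.
for X_batch, y_batch in fetch_new_transactions():
    model.partial_fit(X_batch, y_batch, classes=classes)

print(model.predict(np.zeros((1, 4))))  # score a new transaction immediately
```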

Analytics now and without limits

With this range of technologies, you can deliver analytics that not only let you see what happened, but allow you to win your particular race. You can utilize all internal and external data sets and make them available to all people and business processes with the right governance.

IBM data management handles a variety of workloads and types of analytics across the deployment environment of your choice (cloud, private cloud, on-premises systems). IBM announced that DB2 for Linux, UNIX and Windows will include HTAP capabilities.

IBM is making it possible to offer the complete analytics value chain. We can support you whether you need basic relational capabilities, want to make the move to the real-time HTAP world or need JSON and unstructured data capabilities. Explore HTAP and learn what it can do for your business.

About Matthias

Matthias Funke is the worldwide leader of the IBM Data Management product line and strategy, including key database and data warehouse products such as IBM DB2, dashDB and PureData System for Analytics. He is passionate about data as the “new currency” and looks for new ways to deliver insights from this data. Matthias brings many years of technology experience to his role, including product management, software development and leading software development teams.