Move Over, ERP: Why Data Hubs Belong in the Center of System Landscapes

Timothy Becker
Dec 22, 2021

Around the world, corporate IT landscapes have one thing in common: the enterprise resource planning system — ERP for short — is the central hub that brings it all together. For many companies, though, data hubs may offer a better path to future success.

For decades, ERP systems have cemented their role at the heart of enterprise IT infrastructure, while the software vendors behind them have become tech giants in their own right, with an ever-growing lock-in on their customers. One thing often gets overlooked: ERPs were never meant to serve this central function, at least not when they were originally developed. The fact that they now do can make integrating new technologies highly cumbersome and lead to staggering costs, effectively creating greater and greater barriers to innovation.

This is because no modern enterprise anywhere in the world works without data. Whether it’s financial accounting, digital sales, production, or logistics: every corporate department depends on receiving information quickly from other divisions in order to work effectively, efficiently, and strategically. How data flows internally has an immense impact on a company’s ability to operate successfully and to innovate with new services or product offerings. But how does data currently flow within companies? In many cases, most or all of that data runs through the ERP as the central handling point.

It all revolves around the ERP

ERP systems were originally developed in the 1980s as software for material requirements planning. Their main task was to link the areas of purchasing, production, and sales. Over the decades, they grew, adding numerous extensions and interfacing with more and more areas, such as customer and supplier management, business intelligence, and e-commerce. As a result, the ERP moved from the periphery to the center of the system landscape, meaning an increasing amount of data ultimately had to run through it.

For a long time, this was considered an improvement. But is that still true? In recent years, numerous companies have had a terrible time integrating ERP systems. The list of failed ERP projects includes companies like Deutsche Post, Lidl, Otto Group, and even candy brand Haribo. The level of customization needed to make them work leads to projects that grow ever more complex, while costs and project runtimes go through the roof. In addition, companies seem increasingly uncomfortable at the thought of being dependent upon a single software vendor.

The problems with ERPs

How do these problems come about? In some cases, ERP systems are hampered by slow databases that simply cannot keep up with their increasingly complex role as the data hub of an entire enterprise. Other companies chose comparatively lightweight ERPs, a smart strategy at the outset in order to stay as lean as possible. As those companies grew and their requirements changed, though, the system was continuously adapted more or less on the fly, often resulting in a lack of documentation and transparency. Moreover, the ERP cannot truly synchronize incoming and outgoing data streams; it has to transform the data at each of the two interfaces. On top of that, data quality and the overall expenditure for data maintenance are an issue in many organizations: the effort for entering, searching, and maintaining data is high, and the result is poor data quality, such as outdated or incomplete customer records and duplicates.

As if that were not enough, there is also the general lack of good integration, interfaces, and adaptability when it comes to ERPs. When certain systems have to be updated or replaced, this affects the data stream to all other systems. Development teams often have to improvise new data flows at great effort just to ensure that existing applications continue to run smoothly. This is especially true today, when all applications that work with customer data must be linked with each other. On the one hand, this guarantees a seamless customer experience even across different areas of the company. On the other hand, it ensures that all personal data is properly categorized and securely managed.

The effect: drain on resources and untapped potential

The disadvantages of the ERP system as a central hub for data do not only restrict the IT department; they affect the entire company. The elephant in the room: ERPs can be a powerful brake on innovation and maneuverability. Integrating them with other systems is often highly resource intensive, and many business processes have to be adapted to the ERP system’s logic. Since customizing ERPs is so tedious and expensive, companies are tempted to stick to standard processes. Business processes are no longer designed freely and then implemented: if it cannot be done within the confines of the ERP, it can’t be done at all. Ultimately, a company’s ability to innovate suffers: the ERP drains resources, and new initiatives are stifled because “our system can’t do that”. A system architecture with an ERP in the center often limits itself.

Shifting the paradigm towards the data hub

The primary idea behind a data hub is to place a central layer between the other system layers of the enterprise, serving as something like its spine. This layer governs the entire data structure, including state changes of all data records. It also bundles role management and all interfaces. All systems, whether from production, financial accounting, commerce, or customer relations, are connected to the hub via standardized interfaces and can exchange data with each other quickly and easily via the hub’s API. In contrast to the linear and branched communication of the ERP system, the hub uses a harmonized standard data structure that does not depend on the other systems. In this way, data can be transferred flexibly between any of them.

The data hub enables a flexible data flow between all systems. (Source: Turbine Kreuzberg)
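To make this more concrete, here is a minimal TypeScript sketch of what such a harmonized record and hub API could look like. All type and method names are illustrative assumptions for this article, not a prescribed design.

```typescript
// Illustrative only: names and fields are assumptions, not part of any specific product.

// A harmonized record format that no single connected system dictates:
// every entity carries its own identity, type, payload, and change metadata.
interface HubRecord<T> {
  id: string;            // globally unique across all connected systems
  entityType: string;    // e.g. "customer", "order", "article"
  payload: T;            // the harmonized data itself
  version: number;       // incremented on every state change
  updatedAt: string;     // ISO timestamp of the last change
  updatedBy: string;     // which connected system caused the change
}

// The hub's API as the single point of exchange: connected systems read,
// write, and subscribe here instead of talking to each other directly.
interface DataHubApi {
  get<T>(entityType: string, id: string): Promise<HubRecord<T> | null>;
  upsert<T>(record: HubRecord<T>, role: string): Promise<void>; // role management enforced centrally
  subscribe(entityType: string, handler: (change: HubRecord<unknown>) => void): () => void;
}
```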

The resulting independence of the data hub also means it is generally easier to implement. The hub is initially set up as a completely separate entity from the rest of the IT architecture. It consists of just three components: a system for API management, an event handler, and a powerful database system. This stands in strong contrast to the generally monolithic structure of the ERP. The data hub can be custom-built using open source technologies, for example by developing the hub’s core in TypeScript, a modern programming language built on JavaScript. Building with readily available open source technologies increases the longevity and sustainability of the solution, because the risk of relying on outdated or deprecated proprietary components is minimized.
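As a rough illustration of how lightweight this core can be, the sketch below wires up the three components in TypeScript, assuming Express for API management, Node’s built-in EventEmitter as the event handler, and an in-memory map standing in for the database. It shows the shape of the idea, not a production setup.

```typescript
import express from "express";
import { EventEmitter } from "events";

// 1. Event handler: notifies connected systems about state changes.
const events = new EventEmitter();

// 2. Database: an in-memory map standing in for a real database system.
const store = new Map<string, unknown>();

// 3. API management: a thin HTTP layer that all connected systems talk to.
const app = express();
app.use(express.json());

app.put("/records/:id", (req, res) => {
  store.set(req.params.id, req.body);            // persist the harmonized record
  events.emit("record.changed", req.params.id);  // fan out the state change
  res.sendStatus(204);
});

app.get("/records/:id", (req, res) => {
  const record = store.get(req.params.id);
  if (record === undefined) {
    res.sendStatus(404);
  } else {
    res.json(record);
  }
});

// Connected systems can react to changes without knowing about each other.
events.on("record.changed", (id) => console.log(`record ${id} changed`));

app.listen(3000);
```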

Gradual integration through testing and optimization

Once the three core components are assembled, the hub can continue to be tested and optimized, largely separate from other applications. Only when the resulting performance meets the requirements is the hub gradually integrated with other systems. This has one big advantage: through gradual migration, the old and new data architecture can work in parallel, meaning the data layer can also be introduced during ongoing operations (open-heart surgery, if you will). As soon as the hub is connected to all applications, the old infrastructure can be switched off.
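One common way to realize this parallel operation is to shadow-write to the hub while the ERP remains the system of record. The TypeScript sketch below assumes two hypothetical adapters, erpClient and hubClient; it illustrates the pattern, not any specific product API.

```typescript
// Hypothetical sketch of running old and new data paths in parallel during migration.

interface OrderWriter {
  saveOrder(order: { id: string; total: number }): Promise<void>;
}

declare const erpClient: OrderWriter; // legacy path, still the system of record
declare const hubClient: OrderWriter; // new data hub path, being phased in

// During the transition, every write goes to both sides; the hub's copy can be
// verified against the ERP before the old infrastructure is switched off.
async function saveOrderDuringMigration(order: { id: string; total: number }) {
  await erpClient.saveOrder(order);       // keep existing operations running
  try {
    await hubClient.saveOrder(order);     // shadow write to the new hub
  } catch (err) {
    // A hub failure must not break day-to-day business while it is still being tested.
    console.error("hub write failed, ERP remains authoritative", err);
  }
}
```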

Intelligent data links open up opportunities for innovation

Implementing data hubs can have an immense impact on how companies develop in the future. For example, a central data layer makes it possible to link areas in ways that used to be too difficult to implement: user behaviour, inventory information, or even external weather data can be used to drive intelligent procurement services, and it becomes easier to recognize correlations across business processes and make them more efficient. The data required for this is often already available. The difference is that it is now stored centrally, synchronized, and free of redundancy in a pool to which all systems have direct access.
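As a purely hypothetical example of such a link, the sketch below combines inventory, behavioural, and weather data from the hub into a simple reorder signal. The endpoint paths, field names, and threshold logic are all assumptions made for illustration.

```typescript
// Illustrative only: a procurement signal built from data the hub already holds centrally.

interface InventoryLevel { sku: string; unitsInStock: number; }
interface DemandSignal   { sku: string; viewsLastWeek: number; }
interface WeatherOutlook { heatwaveExpected: boolean; }

async function fetchFromHub<T>(path: string): Promise<T> {
  const res = await fetch(`https://data-hub.internal/${path}`); // hypothetical hub endpoint
  return res.json() as Promise<T>;
}

// Reorder cooled drinks earlier when demand is rising and a heatwave is coming.
async function suggestReorder(sku: string): Promise<boolean> {
  const [stock, demand, weather] = await Promise.all([
    fetchFromHub<InventoryLevel>(`inventory/${sku}`),
    fetchFromHub<DemandSignal>(`behaviour/${sku}`),
    fetchFromHub<WeatherOutlook>("external/weather"),
  ]);
  const expectedDemand = demand.viewsLastWeek * (weather.heatwaveExpected ? 1.5 : 1.0);
  return stock.unitsInStock < expectedDemand;
}
```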

A company’s data flow is a decisive factor in its flexibility and ability to innovate. For a company’s own data to be used intelligently and in a way that adds value, the monolithic structures for storing and processing it first have to be overcome. Only a truly neutral and synchronized data hub gives companies back control over their vast treasure troves of data. And it allows them to shape the future of their organizations.

Thoughts? We want to hear what you think. Drop us a message at hello@turbinekreuzberg.com.

Timothy Becker
Technology Innovation @Turbine Kreuzberg unleashing the potential of IoT, decentralized tech and web 3.0.