Reducing complexity in the global supply chain

Libera Global AI · Published in Clear AI · Nov 5, 2019

Over the last few decades, the continuing march of globalisation has propelled us into a world of increasingly dynamic and interconnected global supply chains. This has created a huge amount of complexity, inefficiency and waste.

For example, as a result of supply chain complexity, approximately 30 percent of food produced for human consumption around the world is either lost or wasted each year. That equates to 1.3 billion tonnes of food and 1 trillion USD in economic costs, according to the Food and Agriculture Organization of the United Nations.

Disorder characterises today’s global supply chain, which is flooded by massive volumes of data generated from disparate sources and systems. Though an impediment to progress, this disorder also represents a wealth of untapped value.

There exists an opportunity to decrease complexity by enabling collaboration in the form of a B2B data cooperative. By pooling information from across data silos and applying advanced analytical techniques to the resulting dataset, businesses opting into the network can both increase revenue and reduce waste. The larger the pool of data, the greater the opportunity.

The ability to organise and mobilise this sea of data — with the help of advanced analytical techniques (such as machine learning) — will go a long way towards activating dormant value in the global supply chain.

Organising global trade data

The introduction of digital technologies has enabled far greater productivity across global supply chains, but the resultant growth has plateaued in recent years. This is primarily due to stakeholders’ inability to utilise the vast quantities of data produced at every juncture of the process. As supply chains become increasingly complex and sprawling, the problem becomes ever more acute.

Because stakeholders are unable to organise, digest and analyse these datasets, the global supply chain has become heavily siloed. Information exchanged across silos is often asynchronous, producing inaccuracies that show up as overproduction and shortages. Companies have historically compensated by holding buffer inventories, a practice which itself erodes profitability and efficiency.

Data fragmentation has introduced large-scale macroeconomic inefficiencies and has made effective planning and execution nearly impossible. It’s clear that current workarounds are ineffective, and a new way of organising and analysing supply chain data is required.

One remedy to supply chain complexity is the application of big data analytics, a form of advanced analytics that enables supply chain stakeholders to uncover patterns and correlations concealed within vast datasets. It can also allow large brands to complement insights drawn from data held on their own ERP and SCM platforms with more current and predictive data from the platforms of trading partners.

By applying powerful statistical methods to data that is created across global supply chains, businesses can minimise friction, increase efficiency and, most importantly, reduce complexity on an international scale.

In the context of food supply chains, for instance, this optimisation would result in drastic reductions in waste and equally significant cost savings.
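As a rough illustration of what this pooling might look like in practice, the sketch below joins a brand’s own ERP shipment history with a trading partner’s sell-through feed to flag products being shipped well in excess of what actually sells. The file names and column names are hypothetical, chosen only for the example.

```python
import pandas as pd

# Hypothetical extracts: the brand's own ERP shipment history and a retail
# partner's more current sell-through feed, keyed by product and week.
erp_shipments = pd.read_csv("erp_shipments.csv")         # sku, week, units_shipped
partner_sales = pd.read_csv("partner_sellthrough.csv")   # sku, week, units_sold

# Pool the two silos into a single view of supply versus actual demand.
combined = erp_shipments.merge(partner_sales, on=["sku", "week"], how="inner")

# A simple overproduction signal: units shipped far in excess of units sold.
combined["excess_units"] = combined["units_shipped"] - combined["units_sold"]
overproduced = (
    combined.groupby("sku")["excess_units"]
    .sum()
    .sort_values(ascending=False)
)

print(overproduced.head(10))  # the SKUs where supply most outruns demand
```

In a real cooperative the joins would span many partners and far richer attributes, but the principle is the same: the value only appears once the silos are combined.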

Extracting dormant value

Advanced analytical techniques have the potential to minimise complexity and optimise supply chains by connecting disparate but relevant data sources. In particular, these techniques are creating new opportunities in warehousing and transportation, allowing third-party logistics providers (3PLs) to better support the needs of the multinational corporations (MNCs) they serve.

3PLs are always looking to drive efficiency for brands, supporting their product demand with more effective logistics services. The creation of a B2B data cooperative and the application of big data analytics to this treasure trove of information can unlock substantial value for 3PLs in pursuit of that objective.

For example, a leading forklift provider is creating a forklift that can act as a big data hub, collecting all manner of data in real time. This data is then blended with ERP and warehouse management system (WMS) data to identify additional waste in the warehouse process. Forklift driving behaviour and route choices can be assessed and dynamically optimised to improve picking productivity.
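A minimal sketch of that kind of blending, assuming a hypothetical telemetry extract from the forklifts and a WMS pick-task extract that share a common task identifier (all names are illustrative):

```python
import pandas as pd

# Hypothetical feeds: forklift telemetry (one row per completed trip) and
# WMS pick tasks, joined on a shared task identifier.
telemetry = pd.read_csv("forklift_telemetry.csv")  # task_id, travel_metres, travel_seconds
wms_picks = pd.read_csv("wms_pick_tasks.csv")      # task_id, zone, units_picked

picks = wms_picks.merge(telemetry, on="task_id", how="inner")

# Travel effort per unit picked, by warehouse zone: a rough proxy for wasted
# movement that re-slotting or smarter routing could remove.
zone_stats = picks.groupby("zone").agg(
    total_metres=("travel_metres", "sum"),
    total_units=("units_picked", "sum"),
)
zone_stats["metres_per_unit"] = zone_stats["total_metres"] / zone_stats["total_units"]

print(zone_stats.sort_values("metres_per_unit", ascending=False))
```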

From a transportation perspective, big data analytics can be harnessed to analyse fuel consumption, optimise routing and reduce waiting times by allocating warehouse bays in real time. Couriers have also started routing deliveries in real time based on van geolocation and traffic data.
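To make the bay-allocation idea concrete, here is a deliberately simple sketch: inbound vans, with arrival estimates derived from geolocation and traffic data, are matched greedily to the bays that free up soonest. The figures and identifiers are invented for illustration, and a production system would use a proper optimiser rather than this heuristic.

```python
import heapq

# Hypothetical real-time inputs: minutes until each van arrives (from
# geolocation and traffic data) and minutes until each bay is free.
van_etas = {"VAN-07": 5, "VAN-12": 18, "VAN-31": 42}
bay_free_in = {"BAY-B": 0, "BAY-A": 10, "BAY-C": 35}

# Greedy pairing: the earliest-arriving van takes the earliest-free bay,
# keeping both van idling time and empty-bay time low.
vans = sorted(van_etas.items(), key=lambda kv: kv[1])
bays = [(free_in, bay) for bay, free_in in bay_free_in.items()]
heapq.heapify(bays)

for van, eta in vans:
    free_in, bay = heapq.heappop(bays)
    wait = abs(eta - free_in)  # minutes one party waits under this pairing
    print(f"{van} (ETA {eta} min) -> {bay} (free in {free_in} min), wait ~{wait} min")
```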

Gains resulting from these types of analyses may appear insignificant in isolation, but when taken in aggregate equate to vast reductions in complexity and huge sums in previously untapped value.

The disorder afflicting the global supply chain today represents a distinct business opportunity. Organisations looking to reduce complexity should embrace big data analytics as a means of making sense of the vast and diverse datasets produced by activities across the supply chain. Advanced analytical techniques will allow businesses to increase revenue through better demand forecasting, and to cut the cost of waste by aligning supply more closely with demand and eliminating excess.
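Even a very basic forecasting method illustrates the mechanism. The sketch below applies simple exponential smoothing to a short, invented series of weekly sales to produce a next-week demand estimate that a planner could order against, rather than defaulting to a buffer.

```python
# Simple exponential smoothing over a hypothetical weekly sales series.
weekly_demand = [120, 135, 128, 150, 144, 160]  # illustrative units sold per week
alpha = 0.3  # smoothing factor: higher values react faster to recent weeks

forecast = weekly_demand[0]
for actual in weekly_demand[1:]:
    forecast = alpha * actual + (1 - alpha) * forecast

print(f"Next-week demand estimate: {forecast:.0f} units")
```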

The potential for optimisation, and thereby for capturing previously untapped value, is great, but realising it requires organisations to take a systemic approach to adopting advanced analytical tools. To reap the full rewards of big data analytics, it must be embedded at the heart of supply chain management. Only then will organisations be able to harness its power to reduce complexity across the global supply chain.
