Delivering Change — How Advanced Analytics Transformed Air Freight’s Guessing Game

High drop-out rates are one of the biggest challenges in the air freight industry. This case study explores how advanced analytics helped overcome this challenge.

It is often assumed that an organisation’s data infrastructure is its most crucial asset when developing advanced analytics interventions. While access to high-quality data is often valuable and helps analytics teams hit the ground running, in our experience the real engine for change is the mindset and enthusiasm of an organisation. To illustrate this point, this article explores QuantumBlack’s recent work in the air freight industry.

This project offered an exciting opportunity to deliver measurable commercial value in a sector that had been relatively untouched by analytics. That lack of analytics maturity presented obstacles for the team, but the organisation we worked with were engaged and eager to embrace the benefits that modelling would bring to the industry.

Industry Context

Air freight makes up a large proportion of an airline’s revenue. Yet while businesses routinely engage in bidding wars to have their cargo shipped on a flight, there are no penalties if they decide to cancel shipping at the last minute. This has resulted in a rather cavalier booking culture and often leaves airlines with no time to re-book their newly available hold space. The resulting high drop-out rate for cargo bookings means that planes often fly with significantly less cargo than their maximum capacity.

The airline company we worked with had identified this as a pain point for their business and were determined to improve their prediction capabilities to lessen the impact of inevitable drop-outs. Doing so would allow them to over-subscribe their bookings when the expected drop-out rate was high, meaning a full cargo hold would be maintained and each flight would be far more profitable.
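To make the mechanics concrete, here is a toy sketch of the overbooking arithmetic; every number in it is invented for illustration.

```python
# Toy overbooking arithmetic (all figures invented for illustration).
hold_capacity_tonnes = 100
predicted_drop_out_rate = 0.20  # model expects 20% of booked cargo to drop out

# Accept bookings beyond physical capacity so the expected surviving
# cargo still fills the hold.
booking_target = hold_capacity_tonnes / (1 - predicted_drop_out_rate)
print(f"Accept bookings up to {booking_target:.0f} tonnes")  # 125 tonnes
```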

Obstacles To Innovation

As with all data projects, the QuantumBlack team first assessed the analytics capabilities of the company and the data capture infrastructure in place. The first issue they encountered was that the backend data capture functions of the business, and of the industry as a whole, were limited and underdeveloped. The industry-wide software platform, originally designed in the 1970s, and the Fortran programming language it used were not easily compatible with modern data forensics techniques.

One of the greatest challenges to overcome was the variety of data sources the team needed to collate and unify in order to assess the recorded data. Both cargo weight and volume were self-reported by the airline’s customers and were often unreliable, with a high degree of human error. In the existing solution, lower-than-predicted cargo values were also being reported as no-shows. The data was stewarded by different teams across the company, meaning it initially had to be manually collated and compared to establish how far the bookings diverged from reality.

These factors meant that the data our team required was stored across multiple sources, in varying degrees of quality and in different formats, and some of it was missing altogether. It was difficult to build a clear picture of how much cargo was really being shipped in the first place, let alone how much capacity remained on each flight.

Delivering Change

After assessing the problem, the QuantumBlack team, in partnership with company leadership, recognised the need to model four factors: volume capacity, weight capacity, volume demand and weight demand. The solution would also need to detect differences between the reported and actual measures and factor these into the final predictions; without these tracked differences, the model would be trained on an inaccurate dataset, greatly reducing its reliability. And because this was a global company with sales taking place around the clock, there was no time to batch transactions and tasks. The data had to be quickly accessible and analytics had to run on the fly.
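As a rough sketch of that four-factor framing, one regression model can be fitted per factor. The dataset, column names and choice of model below are our own assumptions for illustration, not the project’s actual approach.

```python
# Minimal sketch: one regressor per factor (all names are hypothetical).
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

history = pd.read_parquet("booking_history.parquet")  # hypothetical dataset

targets = ["volume_capacity", "weight_capacity", "volume_demand", "weight_demand"]
features = history.drop(columns=targets)

models = {}
for target in targets:
    model = GradientBoostingRegressor()
    model.fit(features, history[target])  # train one model per factor
    models[target] = model
```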

The project started with a pilot of nine routes. To build out the experimentation pipeline, the team relied heavily on Databricks and Kedro, QuantumBlack’s own open-source library for production-ready data analytics code. This provided the structure for the airline company to work with the tools from the beginning, understand how the models were developed and contribute their insight to the project.
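A minimal sketch of how such a pipeline can be wired up in Kedro follows; the node functions, column names and dataset names are illustrative assumptions, not the project’s actual catalog.

```python
# Illustrative Kedro pipeline: unify sources, then flag reporting gaps.
from kedro.pipeline import node, pipeline

def unify_sources(bookings, schedule, capacity):
    """Join the separately stewarded datasets into one table."""
    return bookings.merge(schedule, on="flight_id").merge(capacity, on="flight_id")

def flag_discrepancies(unified):
    """Keep the gap between reported and actual weight as a feature."""
    unified["weight_gap_kg"] = unified["actual_weight_kg"] - unified["reported_weight_kg"]
    return unified

data_engineering = pipeline(
    [
        node(unify_sources,
             inputs=["bookings", "flight_schedule", "capacity"],
             outputs="unified_bookings"),
        node(flag_discrepancies,
             inputs="unified_bookings",
             outputs="model_input_table"),
    ]
)
```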

Azure was used to develop the unified data lake, built up from the pre-existing back-end system, Unisys. This system contained the critical datasets describing airplane capacity and flight schedules, so it was important to incorporate this existing technology. The team also relied on PySpark code to transform the data published to the different layers and to store it back in the data lake.
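The layer-to-layer transformations look roughly like the following PySpark snippet; the lake paths and column names are placeholders, not the client’s real schema.

```python
# Illustrative PySpark transform between data lake layers.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cargo-bookings").getOrCreate()

raw = spark.read.parquet("/lake/raw/bookings")  # placeholder path
curated = (
    raw.filter(F.col("flight_date").isNotNull())
       .withColumn("weight_gap_kg",
                   F.col("actual_weight_kg") - F.col("reported_weight_kg"))
)

# Publish the cleaned layer back to the data lake.
curated.write.mode("overwrite").parquet("/lake/curated/bookings")
```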

To speed up performance, the team used Redis for caching. This front-end layer greatly reduced the traffic from repeated requests. For example, if one salesperson from the airline requested an update on capacity, the calculation was saved and made accessible to anyone making the same request, rather than being recalculated from scratch.
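The pattern is the classic cache-aside lookup, sketched below with redis-py; the key scheme, expiry time and compute_capacity helper are hypothetical.

```python
# Cache-aside sketch with redis-py (key scheme and TTL are assumptions).
import json
import redis

cache = redis.Redis(host="localhost", port=6379)

def compute_capacity(flight_id: str) -> dict:
    """Placeholder for the expensive capacity calculation."""
    ...

def get_capacity(flight_id: str) -> dict:
    key = f"capacity:{flight_id}"
    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit)                   # served from the cache
    result = compute_capacity(flight_id)         # computed once...
    cache.set(key, json.dumps(result), ex=300)   # ...then cached for 5 minutes
    return result
```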

Ending The Air Freight Industry’s Guessing Game

Our solution was capable of producing, via an API, a prediction of the no-show rate on specific flights, along with an accessible and actionable score published into the improved front-end tool with Azure Machine Learning services. This could be used as timely guidance for the freight company’s salespeople on how much more cargo to book for each flight.
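A front-end tool would consume such a prediction with an ordinary HTTPS call; the endpoint URL, key and payload shape below are placeholders rather than the real service.

```python
# Hypothetical call to the no-show scoring endpoint.
import requests

SCORING_URL = "https://example-workspace.azureml.net/score"  # placeholder
API_KEY = "replace-with-endpoint-key"                        # placeholder

payload = {"flight_id": "XX123", "departure_date": "2020-06-01"}
response = requests.post(
    SCORING_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
)
print(response.json())  # e.g. {"no_show_rate": 0.18}
```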

The results clearly showed that the freight company had been consistently underbooking their available cargo space. The predictions allowed them to begin overbooking certain flights with the knowledge that, come take-off, the accurately predicted drop-outs would leave the cargo space near optimum capacity. This immediately and significantly improved the company’s bottom line, and the project was quickly scaled across nearly 2,000 routes.

Alongside the commercial impact, our colleagues inside the organisation had transitioned into a team with confidence in the power of analytics, a confidence they frequently saw rewarded. They were able to identify the weaknesses in their previous data system and became skilled users and advocates of the new solution. Office space was transformed into the organisation’s dedicated analytics Centre of Excellence, which served as a positive case study for the wider company. The QuantumBlack team worked closely with the company to train their team not only in using the solution but also in building out further labs themselves.

When faced with what at first seemed like a monumental task, working alongside experts in the industry in a democratic fashion meant that the final solution was well suited to tackling a real-world problem. By focusing on the problem at hand and taking a methodical approach to improving data capture and storage, the team delivered an end product that proved an immense improvement on the previous system.

Tom Goldenberg, Junior Principal and Brendan Joyce, Junior Principal, QuantumBlack

Many thanks to contributors Bruce Philp, Alex Arutyunyants and Mayur Chougule

QuantumBlack, AI by McKinsey

We are the AI arm of McKinsey & Company. We are a global community of technical & business experts, and we thrive on using AI to tackle complex problems.