QuantumBlack Horizon: Unleashing the power of generative AI

Scaling AI with tried-and-tested industrialization processes

--

Generative AI (“gen AI”) can classify data, summarize information, answer questions, and draft new, original content. It has the potential to transform work across multiple business functions and workflows. Gen AI has been a hot topic since ChatGPT, Bard, GitHub Copilot, Midjourney, and others joined the mainstream, and decision makers are excited by its potential to improve productivity, streamline operational efficiency, and drive innovation.

McKinsey research estimates that gen AI could add the equivalent of $2.6 trillion to $4.4 trillion in value annually. Consider just one use case: a sales call in which the salesperson needs to find the best opportunities to upsell to a customer. Before gen AI, those opportunities would typically be modeled on static customer data gathered before the start of the call, such as demographics and purchasing patterns. A gen AI tool, by contrast, can use real-time information from the ongoing conversation, along with the organization’s overall customer trends, market information, and more, to build a personalized script for the salesperson. Indeed, one of the strengths of gen AI is that it can approximate human behavior closely and, in some cases, indistinguishably. Giving a sales team this kind of gen AI input enables it to get work done faster and better.

Gen AI encompasses not only text generation but also images, video, audio, and code. Organizations across a range of sectors are standing up use cases that capture its potential, and those that do not explore it risk falling behind their competitors.

Decision makers should view gen AI as a transformative opportunity, allowing them to reimagine functions such as R&D, marketing, sales, and customer operations.

Challenges of adopting gen AI

While gen AI expands analytics capabilities, it introduces new challenges too. There are practical considerations in terms of risk management and responsible use. Additionally, there are significant technical considerations such as the need to scale to accommodate larger volumes of data and model complexity.

Gen AI models require a large amount of data to train and generate accurate results. Collecting and preparing the data for training can be a challenge, especially for companies with limited resources. The models primarily work with unstructured data sources such as images and PDF reports, which are not easily consumed in a data science environment and need to be preprocessed.
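To make the preprocessing step concrete, the sketch below shows one common way to turn a PDF report into text chunks that a gen AI model can consume. It uses the open-source pypdf library; the file name and chunk size are illustrative rather than taken from any of the products discussed here.

# A minimal sketch: extract raw text from an unstructured PDF and split it
# into fixed-size chunks for embedding or prompting. File name and chunk
# size are illustrative assumptions.
from pypdf import PdfReader


def pdf_to_chunks(path: str, chunk_size: int = 1000) -> list[str]:
    """Extract text from every page and split it into fixed-size chunks."""
    reader = PdfReader(path)
    text = "\n".join(page.extract_text() or "" for page in reader.pages)
    return [text[i : i + chunk_size] for i in range(0, len(text), chunk_size)]


chunks = pdf_to_chunks("annual_report.pdf")
print(f"{len(chunks)} chunks ready for downstream models")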

Selecting the right model for gen AI can be difficult and organizations will typically use several models of varying size, complexity, and capability. To generate value, these models need to be able to work both together and with the business’s existing systems or applications.

Training and optimizing gen AI models can be a time-consuming and complex process. The pace of change requires teams to keep up with new model versions as they are released and to have a process for tracking each model’s ongoing performance.

An organization needs suitable infrastructure and resources in place to deploy and maintain its AI solutions. Gen AI exacerbates existing failure modes in the AI ecosystem, such as inconsistent and inefficient AI deployments and organizations that are under-equipped to scale.

Companies will need to adopt a technology stack that may differ from their technical teams’ existing expertise, and success is predicated on a multi-disciplinary digital transformation. Indeed, McKinsey research found that only 3% of companies have adopted AI across five or more core business functions, which represents missed opportunities to use AI to drive efficiency, lower costs, and create new revenue streams.

Use case: How gen AI maximizes the value of banking data

Banks are rapidly accumulating growing volumes of transaction data, driven by the surge in online banking, mobile transactions, e-commerce, and digital payments. Extracting value from this data requires sophisticated analytical tools that can enhance banking operations, for example by detecting fraudulent activity, or support customer management, for example by extending lifetime value and preventing churn. Financial institutions that innovate can achieve operational efficiency and gain significant competitive advantage. There is a clear opportunity for the banking sector to adopt tools such as those built on gen AI to increase engagement and revenue generation.

QuantumBlack, AI by McKinsey, is independently developing a product focused on extracting meaningful insights from banking transactions through capabilities such as transaction classification, anomaly detection, and merchant identification from text. These insights support practical downstream use cases such as fraud detection, anti-money laundering, and underwriting.

The product offers advanced analytics modules for transaction classification built on gen AI and deep learning models. It also provides AI-driven core banking capabilities such as merchant identification from transactions, language translation for downstream analytics, geotagging of transaction text using named entity recognition (NER) combined with gen AI, and anomaly detection for fraud prevention. It can be hosted on a bank’s own infrastructure, and its modules can be managed by the bank’s internal analytics teams.
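As an illustration of the NER step behind geotagging, the sketch below uses spaCy’s general-purpose English model to pull location entities out of made-up transaction strings. The product’s own models and data formats are not public, so every name here is an assumption, and real transaction text is usually noisy enough to need normalization or a fine-tuned model first.

# Illustration only: generic NER over fictitious transaction descriptions.
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

# Made-up transaction strings, lightly normalized to title case
transactions = [
    "Card Purchase Cafe Milano London",
    "ATM Withdrawal Munich Hauptbahnhof",
]

for text in transactions:
    doc = nlp(text)
    # GPE (countries/cities) and LOC entities carry the location signal
    places = [ent.text for ent in doc.ents if ent.label_ in {"GPE", "LOC"}]
    print(text, "->", places)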

While it is currently still in R&D, once available this new product is expected to accelerate transaction analytics-based modeling by 40–60%, particularly for customer lifetime value, fraud detection, and financial crime use cases.

The challenge of building analytics for meaningful insights

Working with banking data presents formidable challenges because it is sensitive and contains private details about individuals and businesses. Data mishandling can result in privacy breaches, financial setbacks, and tarnished reputations, so stringent regulations are in place.

To avoid delays during experimentation and keep the development process compliant, it is common to create synthetic data for developing and refining AI models. The fabricated data simulates real-world situations for AI training while upholding the highest standards of data security and confidentiality. For this reason, the product includes a synthetic data generation capability.

The transactions data generator combines the Data Fabricator tool from QuantumBlack Horizon with the power of OpenAI to simulate banking transaction data that is locale-specific and realistic, but completely artificial.
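The snippet below is a simplified sketch of that idea using the OpenAI Python client directly. Data Fabricator’s interface is not public, so this is not its API; the model name and prompt are assumptions.

# A simplified sketch of locale-specific synthetic transaction generation
# using the OpenAI Python client directly; not the product's actual code.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Generate five fictitious UK retail banking transactions as a JSON list. "
    "Each item needs: date, merchant, description, amount_gbp, category. "
    "Use plausible UK merchants and locations, but no real customer data."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; any capable chat model works
    messages=[{"role": "user", "content": prompt}],
)

# In practice the output would be validated and loaded into a dataframe
print(response.choices[0].message.content)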

Kedro, one of Horizon’s flagship products, is used to build the data engineering and data science pipelines. It is a powerful tool that has transformed the way data professionals collaborate, by providing a structured approach to help them write reproducible, maintainable, and modular data science code. Individual components of the data processing workflow, such as data loading, transformation, and output, are treated as separate, independent tasks to facilitate scalability.

Kedro integrates seamlessly with other tools and platforms to enable the construction and deployment of robust data pipelines in varied environments. Pipelines can be used on their own for single use cases or combined end to end, from data engineering through categorization to downstream ML banking modules for financial risk, credit risk, and growth, as sketched below.
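To make this concrete, here is a minimal Kedro sketch with hypothetical node and dataset names rather than the product’s actual pipelines. Each step is an independent node, the nodes are grouped into pipelines, and pipelines can be combined with the "+" operator into an end-to-end workflow.

# A minimal Kedro sketch; node and dataset names are hypothetical.
import pandas as pd
from kedro.pipeline import node, pipeline


def clean_transactions(raw: pd.DataFrame) -> pd.DataFrame:
    """Normalize the free-text descriptions used by downstream models."""
    raw["description"] = raw["description"].str.lower().str.strip()
    return raw


def categorize(clean: pd.DataFrame) -> pd.DataFrame:
    """Placeholder for the classification step; a real node would call a model."""
    clean["category"] = "uncategorized"
    return clean


data_engineering = pipeline(
    [node(clean_transactions, inputs="raw_transactions", outputs="clean_transactions")]
)

categorization = pipeline(
    [node(categorize, inputs="clean_transactions", outputs="categorized_transactions")]
)

# Pipelines can run standalone or be combined into an end-to-end workflow
end_to_end = data_engineering + categorization

Each dataset name maps to an entry in Kedro’s data catalog, so the same nodes can read from local files during development and from a bank’s own systems in production.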

How QuantumBlack Horizon can benefit your organization

QuantumBlack Horizon is a suite of tools for AI development and deployment. Announced in June 2023, it enables organizations to leap from proof-of-concept to productionized AI at scale. Horizon helps McKinsey clients discover, assemble, tailor, and orchestrate AI projects.

The tools in Horizon reduce the time required to realize value from analytics and ensure consistent returns across a portfolio of use cases. They are built to be flexible, interoperable, and compatible with all key technology platforms and modern tech-stack components.

Horizon also complements an existing data and analytics technology landscape. It supports organizations with existing silos of analytics excellence that have implemented some use cases but have yet to scale them because of fragmented data and AI platform ecosystems. In these scenarios QuantumBlack Horizon augments existing workflows and technology rather than replaces them.

QuantumBlack Horizon supports gen AI projects

Developed to serve McKinsey clients, the Horizon suite sets up the “factory-like” foundations that gen AI projects need during a digital transformation, grounding them in a sustainable tech stack that includes the following:

AI4DQ, FUSE2 and Data Fabricator: effective data curation. Gen AI models primarily work with unstructured data sources, and collecting and preparing that data for model training is a challenge, especially for companies with limited resources or with fragmented data quality and data validation programs. AI4DQ is a customizable data profiling and repair kit that uses AI to fast-track the transformation of messy data into robust AI fuel. FUSE2 helps resolve spelling inconsistencies in your data records, a kind of record cleaning sketched generically after this list. Data Fabricator generates realistic mock data for AI development without exposing sensitive customer information.

Kedro, Brix and Alloy: standard protocols and repeatable methodologies. As organizations adopt new AI use cases, we regularly see teams struggle to move those use cases into production: workflows remain non-standard, teams do not leverage reusable code, and information sharing is limited. Kedro allows teams to build assembly lines for AI use cases by standardizing workflows. Brix is a home for reusable code components so that teams are not reinventing the wheel. Alloy provides the blueprint for writing reusable code blocks.

Iguazio: continuous rollout of new AI services that drive business innovation. The specialist infrastructure required for MLOps and LLMOps is changing rapidly, with a constant drive to develop, deploy, and manage gen AI projects while reducing risk and minimizing cost. Iguazio makes it possible to deploy models faster while ensuring production readiness. Integrated monitoring spots issues before they spread. MLRun manages AI applications across their lifecycle and reduces engineering effort. Nuclio minimizes maintenance overhead and automates the deployment of AI applications in any environment.
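As a generic illustration of the record-cleaning capability described for FUSE2, whose own interface is proprietary, the sketch below resolves spelling inconsistencies in made-up merchant names using only Python’s standard library.

# Illustration only: resolving spelling inconsistencies in merchant names.
# This is not FUSE2's API; it shows the kind of record cleaning described
# above, with made-up canonical names.
from difflib import get_close_matches

CANONICAL_MERCHANTS = ["starbucks", "amazon marketplace", "shell"]


def standardize(name: str) -> str:
    """Map a noisy merchant string to its closest canonical spelling."""
    matches = get_close_matches(name.lower(), CANONICAL_MERCHANTS, n=1, cutoff=0.6)
    return matches[0] if matches else name.lower()


print(standardize("Starbcks"))         # close match -> "starbucks"
print(standardize("amazon mktplace"))  # close match -> "amazon marketplace"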

Conclusion

In this use case, we described a recent QuantumBlack Labs initiative built upon gen AI. It is just one of many novel use cases that look set to revolutionize work processes through fast, flexible analytics development.

We recognize that scaling AI is more than a technology problem. QuantumBlack Horizon functions as an end-to-end factory for analytics development and deployment. Horizon shifts the organizational mindset towards the AI industrialization processes needed to develop, deploy, monitor, and refactor AI. It is designed to optimize the efficiency and consistency of AI project delivery and to accelerate model development over subsequent projects.

To learn more about what QuantumBlack Horizon can do for you, please email Yetunde Dada.

--


QuantumBlack, AI by McKinsey

We are the AI arm of McKinsey & Company. We are a global community of technical & business experts, and we thrive on using AI to tackle complex problems.