A Large Language Model is not enough: introducing Context Fusion & Toolkit in enterprise solutions

Part 1: Building enterprise LLM solutions powered by a Contextual Powerhouse

Marc van Eck
Merkle Data & Technology
7 min read · Jun 22, 2023


The landscape of generative artificial intelligence (GenAI) has been evolving rapidly over the last few months. Large Language Models (LLMs) have revolutionized the way we interact with text and generate human-like responses. However, it is important to recognize that relying solely on a Large Language Model is not enough to fully unlock its potential.

This first article aims to shed light on the limitations of LLMs and present valuable solutions that address these gaps. In the second article, I will touch on how these solutions helped us create our own enterprise solution called Genai, and how it can help in other marketing use cases.

Before we dive in, a short introduction for those unfamiliar with Large Language Models:

Large Language Models are neural network-based models designed to process and generate human-like text. Because they are trained on extensive datasets, LLMs learn language patterns, semantic relationships, and context. By leveraging deep learning techniques, they can generate coherent and contextually relevant responses.

TL;DR: this article in short

While LLMs have demonstrated remarkable capabilities, incorporating domain-specific context is crucial for achieving more accurate and relevant outputs. To bridge this domain-specific gap, an innovative approach called Context Fusion is introduced and defined as:

Context Fusion: The process of combining diverse sources of business data and contextual information to enrich the LLM's knowledge and improve its business-oriented performance.

By seamlessly integrating business-specific knowledge, Context Fusion empowers enterprise LLMs, enabling them to generate contextually appropriate responses and insights. In the end, this makes them respond more like a human working for that specific business.

Additionally, we explore the importance of adding Toolkit to further enhance Context Fusion. Toolkit is defined as:

Toolkit: An extensive set of (custom-built) tools and functionalities that seamlessly integrate with the contextual powerhouse, amplifying the LLM's capabilities and enabling tailored business outcomes.

Toolkit serves as a comprehensive set of tools and resources designed to augment the functionality and usefulness of LLMs. By incorporating the Toolkit, LLMs gain extended capabilities, such as search functionalities, API connections and more, which contribute to more valuable and reliable solutions.

By incorporating Context Fusion and Toolkit, organizations create a true Contextual Powerhouse that supports companies in tailoring LLMs to their specific needs while adding effective governance, context and tools.

Contextual Powerhouse Ecosystem, courtesy of the writer

Creating a Contextual Powerhouse

Exploring the Benefits of Context Fusion in LLMs

An LLM on its own is not enough to generate responses that are both human-like and contextually accurate. To support LLMs in their task, whatever the use case may be, we need to fuse the knowledge of an LLM with context that is specific to a topic, person, business or industry. This concept is named and defined as:

Context Fusion: The process of combining diverse sources of business data and contextual information to enrich the LLM’s knowledge and improve its business-oriented performance.

One of the primary advantages of Context Fusion is its ability to enhance language model performance by making information explicitly available. Traditional LLMs often lack the capability to fully understand and generate contextually accurate responses within specific domains. Because LLMs such as OpenAI's are trained on extremely large datasets and have a knowledge cutoff, they can be unaware of specific details or recent developments. To stay with OpenAI as an example, its latest GPT-4 model is rumored to have on the order of a trillion parameters, trained on an enormous but fixed corpus. Chances are that very company-specific data is simply not in there, or is as hard to find as a needle in a haystack.

By incorporating Context Fusion, businesses can provide the LLM with the necessary contextual information to generate much more precise and business-specific outputs. This can be beneficial for many use cases, such as chatbots and tailored content creation, which will be discussed in part 2 of this series.
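To make this concrete, the sketch below shows one minimal way Context Fusion could be wired up in Python: a handful of business snippets, a naive keyword retriever, and a prompt builder that fuses the retrieved context with the user's question. The snippet store, the retrieval logic and the call_llm() stub are illustrative assumptions, not our actual implementation; a production setup would typically use embeddings, a vector store and a real LLM API behind call_llm().

```python
# Minimal sketch of Context Fusion: retrieve business-specific snippets and
# fuse them into the prompt before the LLM is called. The snippet store, the
# keyword-based retriever and the call_llm() stub are illustrative assumptions.
import re

BUSINESS_CONTEXT = [
    "Our premium plan includes 24/7 support and a 99.9% uptime guarantee.",
    "Refunds are processed within 14 days, in line with EU consumer law.",
    "Brand voice: friendly, concise, no technical jargon.",
]

def _tokens(text: str) -> set:
    """Lowercase word tokens with punctuation stripped."""
    return set(re.findall(r"[a-z0-9']+", text.lower()))

def retrieve_context(question: str, top_k: int = 2) -> list:
    """Rank stored snippets by naive keyword overlap with the question."""
    q = _tokens(question)
    ranked = sorted(BUSINESS_CONTEXT, key=lambda s: len(q & _tokens(s)), reverse=True)
    return ranked[:top_k]

def build_fused_prompt(question: str) -> str:
    """Fuse the retrieved business context with the user question."""
    context = "\n".join(f"- {s}" for s in retrieve_context(question))
    return (
        "Answer using only the business context below.\n"
        f"Business context:\n{context}\n\n"
        f"Question: {question}"
    )

def call_llm(prompt: str) -> str:
    """Placeholder for the actual LLM call (e.g. a chat completion request)."""
    return f"[LLM answer grounded in a prompt of {len(prompt)} characters]"

print(call_llm(build_fused_prompt("How fast do you process refunds?")))
```

The key idea is that the model no longer needs to "know" the refund policy: the relevant policy text is handed to it explicitly at prompt time.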

With Context Fusion in place, businesses can unlock valuable insights from their (publishable) data. By integrating domain-specific knowledge such as industry or business terminology, previous work, product offerings, regulatory frameworks, market trends, or any other relevant piece of context, LLMs can generate responses that align closely with the complexities of the business environment. It also helps in making the LLM more human, warm, and in line with brand guidelines. This allows for improved decision-making, more accurate customer interactions, and enhanced operational efficiency.

Furthermore, Context Fusion addresses the need for governance and control in LLMs. As businesses increasingly rely on AI-powered language models, ensuring compliance with regulations, ethical considerations, and company policies becomes paramount. Context Fusion provides a mechanism to incorporate governance frameworks into LLMs, allowing businesses to maintain more control over the generated outputs and mitigate potential risks.
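As a rough illustration of such a governance mechanism, the sketch below wraps the generation step in a simple policy gate. The blocked terms, the fallback message and the stub LLM are purely illustrative assumptions; a real governance layer would encode company policy, regulatory and ethical constraints, and usually checks both the incoming prompt and the outgoing answer.

```python
# Minimal sketch of a governance gate around generation. The blocked terms and
# the fallback message are illustrative assumptions only.
BLOCKED_TERMS = {"guaranteed returns", "medical advice"}

def passes_policy(text: str) -> bool:
    """Reject outputs containing terms the business is not allowed to use."""
    lowered = text.lower()
    return not any(term in lowered for term in BLOCKED_TERMS)

def governed_generate(prompt: str, generate) -> str:
    """Run the LLM, then gate its output through the policy check."""
    answer = generate(prompt)
    if not passes_policy(answer):
        return "I'm not able to answer that. Please contact our support team."
    return answer

# Stub LLM that produces a non-compliant answer, so the gate kicks in.
fake_llm = lambda p: "Our fund offers guaranteed returns of 20% per year."
print(governed_generate("Describe our investment product.", fake_llm))
```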

The description above can be visualized in the flowchart below, where we see both Context Fusion and governance frameworks interact.

Context Fusion within the ecosystem, courtesy of the writer

Beyond intelligence: integrations with Toolkit

To further expand the capabilities of Context Fusion, businesses can integrate a powerful Toolkit into the LLM infrastructure. This Toolkit serves as an ensemble of additional tools and functionalities that complement the Context Fusion process, enabling even more advanced and diverse applications.

Toolkit: An extensive set of custom-built tools and functionalities that seamlessly integrate with the contextual powerhouse, amplifying the LLM’s capabilities and enabling tailored business outcomes.

A very common tool to add to the LLM infrastructure is a Calculator module. Large Language Models are, as the name suggests, focused on language. They excel at generating human-like text and responses, but their ability to accurately perform complex mathematical computations is still an area for improvement. Adding a Calculator module empowers the LLM to perform complex calculations, making it an invaluable asset for financial institutions, data analysis, or any situation that requires computational capabilities.

An example of a Calculator tool, limited to multiplication in this case, courtesy of the writer
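A minimal sketch of how such a Calculator tool could plug into a Toolkit is shown below. The tool-call format (a dict with "tool" and "input" keys) and the TOOLKIT registry are assumptions made for illustration; in practice this role is often played by native function calling or frameworks such as LangChain. The arithmetic itself is evaluated safely via the ast module rather than eval().

```python
# Minimal sketch of a Toolkit with a Calculator tool. The tool-call format is
# an illustrative assumption, not a specific vendor API.
import ast
import operator

_OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
        ast.Mult: operator.mul, ast.Div: operator.truediv}

def _safe_eval(node):
    """Recursively evaluate a parsed arithmetic expression (no eval())."""
    if isinstance(node, ast.Expression):
        return _safe_eval(node.body)
    if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
        return node.value
    if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
        return _OPS[type(node.op)](_safe_eval(node.left), _safe_eval(node.right))
    raise ValueError("Unsupported expression")

def calculator(expression: str) -> str:
    """Calculator tool: handles arithmetic the LLM should not do itself."""
    return str(_safe_eval(ast.parse(expression, mode="eval")))

# The Toolkit is a simple registry mapping tool names to callables.
TOOLKIT = {"calculator": calculator}

def run_tool_call(tool_call: dict) -> str:
    """Dispatch a tool request emitted by the LLM to the matching tool."""
    return TOOLKIT[tool_call["tool"]](tool_call["input"])

# Example: the LLM decides it needs a multiplication and emits a tool call.
print(run_tool_call({"tool": "calculator", "input": "1337 * 42"}))  # -> 56154
```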

A compelling advantage of the Toolkit is the ability to incorporate custom-built tools, offering businesses unique capabilities. For instance, an e-commerce company could develop a tool that integrates product recommendations and personalized marketing strategies directly into the LLM. By leveraging customer data and preferences, this custom-built tool empowers the LLM to generate tailored product suggestions, promotional offers, and persuasive marketing content. With the ability to understand customer preferences and deliver targeted marketing messages, the LLM becomes a valuable asset for driving conversions, enhancing customer engagement, and ultimately boosting sales.
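Sticking with the e-commerce example, a hypothetical product recommender could be registered in the same TOOLKIT dictionary used in the Calculator sketch above. The customer data, product tags and matching logic below are placeholder assumptions standing in for a real recommendation service or customer data platform.

```python
# Illustrative custom tool: a hypothetical product recommender plugged into
# the same TOOLKIT registry as the Calculator above. The data and matching
# logic are placeholders for a real recommendation service.
CUSTOMER_PREFERENCES = {"cust_42": ["running", "outdoor"]}
PRODUCTS = {
    "trail shoes": ["running", "outdoor"],
    "office chair": ["home office"],
}

def recommend_products(customer_id: str) -> str:
    """Return products whose tags overlap with the customer's preferences."""
    prefs = set(CUSTOMER_PREFERENCES.get(customer_id, []))
    matches = [name for name, tags in PRODUCTS.items() if prefs & set(tags)]
    return ", ".join(matches) or "no personalised matches found"

TOOLKIT["recommender"] = recommend_products
print(run_tool_call({"tool": "recommender", "input": "cust_42"}))  # trail shoes
```

The point is not the recommender itself, but that any business-specific capability can be exposed to the LLM through the same uniform registry.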

Additionally, the Toolkit may include tools for sentiment analysis, API connections, next-best actions, and more, depending on the specific needs of the business. Each tool seamlessly integrates with Context Fusion, creating a comprehensive ecosystem that empowers the LLM with a broad range of functionalities.

Visualizing Toolkit and its relation to Context Fusion results in the flowchart below.

Toolkit within the ecosystem, courtesy of the writer

Embrace AI-Powered Contextual Solutions Today

In conclusion, the introduction of Context Fusion and the Toolkit represents a transformative step in harnessing the true potential of Large Language Models (LLMs) for businesses. By seamlessly integrating business-specific knowledge, Context Fusion empowers LLMs to become contextual powerhouses, delivering accurate and relevant outputs tailored to the needs of businesses and customers. The Toolkit further enhances Context Fusion by providing a comprehensive set of custom-built tools and functionalities that amplify the LLM’s capabilities and enable tailored business outcomes.

The key takeaways from this article are:

  1. Context Fusion enables LLMs to generate contextually accurate responses within specific domains, improving language model performance and enhancing decision-making, customer interactions, and operational efficiency.
  2. Incorporating governance frameworks through Context Fusion addresses the need for control and compliance in LLMs, mitigating potential risks associated with AI-powered language models.
  3. The Toolkit complements Context Fusion by integrating additional tools and functionalities, such as a Calculator module for complex mathematical computations and custom-built tools for personalized marketing strategies, sentiment analysis, API connections, and more.

In part 2 of this series, Context Fusion and Toolkit will be discussed from a more practical point of view. At Merkle, we created our own solution called Genai, which leverages Context Fusion to formulate Merkle-specific answers. Other marketing use cases applying Context Fusion and Toolkit will also be discussed. See the link below for part 2!

Merkle’s Genai will be discussed in part 2, courtesy of Merkle NE

To unlock the full potential of generative AI and explore how Context Fusion and the Toolkit can benefit your business, reach out via Merkle's website or connect on LinkedIn. Embrace the power of contextual intelligence and revolutionize your business outcomes today.



Marc van Eck
Merkle Data & Technology

Senior Data Scientist at Merkle Northern Europe, based in Amsterdam. Writing about Data Science & Cloud Engineering. Connect via www.linkedin.com/in/marcveck/