How I Built an LLM Framework for Intelligent Propagation of Prompt Requests Based on Type

Nayan Paul
3 min readMay 20, 2023

Problem Statement

Recently I was in an internal discussion about some of the use cases for generative AI and the kinds of business problems we want to solve with it (yes, I talk a lot! 😀).

Let me start with a few things from that discussion that made me write this blog. First, I was asked how we can answer a question like: "My company has multiple tax policies; each policy has specific capabilities, a cost structure, FAQs, and so on. How can we interact with an LLM and answer questions that are infused with context from organizational information?" This is a classic LLM case of grounding and in-context learning for question answering (and I have a dedicated blog on that @ https://medium.com/@nayan.j.paul/how-i-used-large-language-models-llm-to-automate-personal-slack-bot-for-qna-de9b8c2a1414). We had already done that, and I was very confident answering the question by pulling up the blog I just mentioned.
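
To make the grounding idea concrete, here is a minimal sketch in Python. The document store, the keyword-overlap retriever, and the prompt template are all stand-ins I made up for illustration; in a real setup you would use embeddings and a vector store, and send the assembled prompt to whichever LLM API you use.

```python
# Toy illustration of grounding: pull the most relevant policy snippet from
# organizational documents and place it in the prompt, so the LLM answers
# from that context instead of from its own memory.
# POLICY_DOCS, retrieve_context, and the prompt wording are hypothetical.

POLICY_DOCS = {
    "gold-plan": "Gold plan: covers X, Y, Z. Annual cost 1200. FAQ: ...",
    "basic-plan": "Basic plan: covers X only. Annual cost 300. FAQ: ...",
}

def retrieve_context(question: str, docs: dict, top_k: int = 1) -> str:
    """Naive keyword-overlap retrieval; stands in for a proper vector search."""
    q_words = set(question.lower().split())
    scored = sorted(
        docs.items(),
        key=lambda kv: len(q_words & set(kv[1].lower().split())),
        reverse=True,
    )
    return "\n".join(text for _, text in scored[:top_k])

def build_grounded_prompt(question: str) -> str:
    """Assemble an in-context prompt: retrieved org context + the question."""
    context = retrieve_context(question, POLICY_DOCS)
    return (
        "Answer the question using ONLY the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

if __name__ == "__main__":
    prompt = build_grounded_prompt("What does the basic plan cost per year?")
    print(prompt)  # this prompt would then be sent to the LLM of your choice
```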

Next came the curveball: "how can we ask the LLM which policy I have already selected earlier?" This is a case of someone asking a pointed question whose answer has to be fetched from a database (or something similar). We should not generate that answer; we should state the exact value stored in the database. This is also something we had solved independently: how to access a database and fetch the status of a ticket or the value of an existing field, as in the sketch below.
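
Here is an equally small sketch of that second case. The table, column names, user id, and keyword check are hypothetical; the point is that the answer is read back verbatim from the database, rather than being generated by the model.

```python
# Toy illustration of the "pointed question" case: "which policy did I
# select?" should be answered with the exact stored value, not generated.
import sqlite3

def setup_demo_db() -> sqlite3.Connection:
    """Create an in-memory table holding each user's selected policy (demo data)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE user_policy (user_id TEXT, selected_policy TEXT)")
    conn.execute("INSERT INTO user_policy VALUES ('u123', 'gold-plan')")
    return conn

def lookup_selected_policy(conn: sqlite3.Connection, user_id: str) -> str:
    """Return the exact value from the database; the LLM never guesses it."""
    row = conn.execute(
        "SELECT selected_policy FROM user_policy WHERE user_id = ?", (user_id,)
    ).fetchone()
    return row[0] if row else "no policy on record"

def answer(question: str, user_id: str, conn: sqlite3.Connection) -> str:
    # Crude intent check: questions about the user's selected policy go to the
    # database; everything else would go down the grounded-QA path shown earlier.
    q = question.lower()
    if "selected" in q or "which policy" in q:
        return lookup_selected_policy(conn, user_id)
    return "(route to grounded LLM Q&A)"

if __name__ == "__main__":
    conn = setup_demo_db()
    print(answer("Which policy have I already selected?", "u123", conn))
```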
