Prompt Engineering for LLM Applications with LangChain

In the ever-evolving landscape of Natural Language Processing (NLP), prompts have emerged as a powerful tool to interact with language models. They serve as the bridge between human intent and machine-generated responses. The LangChain framework stands out in this domain, providing a robust set of tools for crafting and optimizing prompts for various applications.

Prompt templates

Prompt templates are pre-defined recipes for generating prompts for language models.

A template may include instructions, few-shot examples, and specific context and questions appropriate for a given task.

LangChain provides tooling to create and work with prompt templates.

LangChain strives to create model-agnostic templates, making it easy to reuse existing templates across different language models.

Prompt templates allow parametrizing and reusing prompts by separating prompt structure from prompt data.

There are essentially two distinct kinds of prompt template — string prompt templates and chat prompt templates. String prompt templates produce a simple prompt in string format, while chat prompt templates produce a structured list of messages to be used with a chat API.

In LangChain's class hierarchy, BasePromptTemplate is the abstract base class; StringPromptTemplate is an abstract subclass for templates that render to a single string, and PromptTemplate is its standard concrete implementation. Subclassing StringPromptTemplate directly gives you more flexibility and customization options, as the custom template example below shows.

Prompt templates expose input variables that get formatted into the prompt string via the format() method.

String prompt templates

Chat prompt templates

ChatPromptTemplate is for multi-turn conversations with chat history.

Custom prompt templates

There may be cases where the default prompt templates do not meet your needs. For example, you may want to create a prompt template with specific dynamic instructions for your language model. In such cases, you can create a custom prompt template.

from langchain.prompts import PromptTemplate, StringPromptTemplate
from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI

template = """
Given the user's name, write them a personalized greeting.

User's name: {name}

Your response:
"""

prompt = PromptTemplate.from_template(template)

# A custom template subclasses StringPromptTemplate and overrides format().
class NamePrompt(StringPromptTemplate):
    def format(self, **kwargs) -> str:
        name = kwargs["name"]
        return prompt.format(name=name)

name_prompt = NamePrompt(input_variables=["name"])

print(name_prompt.format(name="John"))

openai = ChatOpenAI(model_name="gpt-3.5-turbo")

# Use a distinct variable name so the imported LLMChain class is not shadowed.
chain = LLMChain(llm=openai, prompt=name_prompt)

chain.invoke({"name": "John"})
Response:

{'name': 'John',
'text': "Hello John! How are you today? I hope you're having a great day so far. Is there anything specific you'd like assistance with? Feel free to let me know and I'll be happy to help you out."}

So in summary, custom prompt templates allow you to create specialized, multi-step prompt formats tailored to your particular use case, beyond what the default templates support out-of-the-box.

Few-shot prompt templates

Here are some key points about few-shot prompt templates in LangChain:

  • FewShotPromptTemplate allows dynamically selecting a few examples to include in the prompt based on the input. This is useful for few-shot learning.
  • It takes in an ExampleSelector object that selects examples based on the input. Common selectors include SemanticSimilarityExampleSelector and LengthBasedExampleSelector.
  • The ExampleSelector needs a list of examples and a way to format them, usually using a PromptTemplate.
  • FewShotPromptTemplate also needs a template for the overall prompt structure with placeholders for where to insert the examples.
  • When formatting the prompt, it runs the input through the ExampleSelector to pick examples, formats them, and inserts them into the overall prompt template.
  • This allows the prompt to dynamically tailor the examples shown based on the specific input.
  • It enables few-shot learning by showing the most relevant examples for a given input.
  • The examples can come from a dataset, be generated on the fly, or even added dynamically.

Let me explain this with an example. We start from a menu item, with its name and description, and from these we want to derive the primary, secondary, and tertiary proteins, plus the lettuce and cheese types where appropriate.

Here are the few-shot examples:

examples = [
    {
        "menu": {
            "menu_item": "Just Veggie Omelet",
            "menu_description": "Sauteed spinach and mushrooms, goat cheese."
        },
        "attrib_structure": {
            "primary_protein": "veggie",
            "secondary_protein": "egg",
            "tertiary_protein": "",
            "Cheese Types": "Cheese",
            "Lettuce Types": "Spinach"
        }
    },
    {
        "menu": {
            "menu_item": "Buddha bowl",
            "menu_description": "Assorted vegetables with egg, bacon, chicken, Spinach, Cheddar and black red bean sausage",
        },
        "attrib_structure": {
            "primary_protein": "chicken",
            "secondary_protein": "bacon",
            "tertiary_protein": "egg",
            "Cheese Types": "Cheddar",
            "Lettuce Types": "Spinach"
        }
    },
    {
        "menu": {
            "menu_item": "Bacon topped grilled Organic Chicken Breast",
            "menu_description": "Tomato, Spring Mix with American cheese, sausage and avocado",
        },
        "attrib_structure": {
            "primary_protein": "Chicken",
            "secondary_protein": "Bacon",
            "tertiary_protein": "sausage",
            "Cheese Types": "American Cheese",
            "Lettuce Types": "Spring Mix"
        }
    }
]

from langchain.prompts import (
    FewShotChatMessagePromptTemplate,
    ChatPromptTemplate,
)

# This is a prompt template used to format each individual example.
example_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "Enter Menu Description:"),
        ("human", "{menu}"),
        ("ai", "{attrib_structure}"),
    ]
)

few_shot_prompt = FewShotChatMessagePromptTemplate(
    example_prompt=example_prompt,
    examples=examples,
)

print(few_shot_prompt.format())

Output:

System: Enter Menu Description:
Human: {'menu_item': 'Just Veggie Omelet', 'menu_description': 'Sauteed spinach and mushrooms, goat cheese.'}
AI: {'primary_protein': 'veggie', 'secondary_protein': 'egg', 'tertiary_protein': '', 'Cheese Types': 'Cheese', 'Lettuce Types': 'Spinach'}
System: Enter Menu Description:
Human: {'menu_item': 'Buddha bowl', 'menu_description': 'Assorted vegetables with egg, bacon, chicken, Spinach, Cheddar and black red bean sausage'}
AI: {'primary_protein': 'chicken', 'secondary_protein': 'bacon', 'tertiary_protein': 'egg', 'Cheese Types': 'Cheddar', 'Lettuce Types': 'Spinach'}
System: Enter Menu Description:
Human: {'menu_item': 'Bacon topped grilled Organic Chicken Breast', 'menu_description': 'Tomato, Spring Mix with American cheese, sausage and avocado'}
AI: {'primary_protein': 'Chicken', 'secondary_protein': 'Bacon', 'tertiary_protein': 'sausage', 'Cheese Types': 'American Cheese', 'Lettuce Types': 'Spring Mix'}
from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI

final_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a wondrous menu ingredient mapper."),
        few_shot_prompt,
        ("human", "{menu}"),
    ]
)

openai = ChatOpenAI(model_name="gpt-3.5-turbo")

# Use a distinct variable name so the imported LLMChain class is not shadowed.
chain = LLMChain(llm=openai, prompt=final_prompt)

menu = {
    "menu_item": "Chicken Combo",
    "menu_description": "Served with Shrimp, Sauteed vegetables, sausage, mushrooms, Spring Mix and American Cheese."
}

chain.invoke({"menu": menu})
Output:

{'menu': {'menu_item': 'Chicken Combo',
'menu_description': 'Served with Shrimp, Sauteed vegetables, sausage, mushrooms, Spring Mix and American Cheese.'},
'text': "{'primary_protein': 'Chicken', 'secondary_protein': 'Shrimp', 'tertiary_protein': 'sausage', 'Cheese Types': 'American Cheese', 'Lettuce Types': 'Spring Mix'}"}


Prasanth Vedantam | www.ainxus.com

Co-Founder & CEO, AINXUS | Specializes in partnering with clients to drive digital transformation initiatives and build cutting-edge products and solutions