Prompt Templates in LangChain

Sami Maameri
6 min read · Apr 1, 2024


Do you ever get confused by Prompt Templates in LangChain? What do the curly brackets do? How do you pass in the variables to get the final string? How do multi-line strings work? What about f-strings?

If these questions sound familiar, you have come to the right place! I found it confusing too, so I have listed out here how a lot of these variations work. Much of it comes down to knowing how Python string literals work, e.g. how multi-line strings or f-strings behave, so if you are not too familiar with those, it can be extra confusing when you see them pop up in PromptTemplates.

Well, that was the case for me at least.

Let’s dive in!

Contents

Setup

How to Create a Prompt Template

Single vs. Double Quotes

Prompt Template with pure strings

Prompt Template with variables

Creating Prompt Templates via the Constructor

Prompt Templates with Multiline Strings and Variables

Prompt Templates with f-strings and Variables

Prompt Templates with Multiline f-strings and Variables

Using a PromptTemplate in LLM Calls

Conclusion

Setup

A few things to set up before we start diving into Prompt Templates. To follow along, you can create a project directory, set up a virtual environment, and install the required packages.

mkdir prompt-templates
cd prompt-templates
python3 -m venv .venv
source .venv/bin/activate
touch prompt-templates.py

pip install python-dotenv langchain langchain-openai

You can also clone the below code from GitHub using

git clone https://github.com/smaameri/prompt-templates

Or check it out in a Google Colab if you prefer

At its core, a PromptTemplate is just a string template we can pass variables to in order to generate our final string. We can then pass these PromptTemplates to LLMs in order to create consistently formatted prompts for the LLM.

How to Create a Prompt Template

Usually we create the PromptTemplate using the from_template method, and call the format method to generate the final string

from langchain.prompts import PromptTemplate

prompt_template = PromptTemplate.from_template('my string ..')
print(prompt_template.format()) # -> 'my string ..'

More details below. But first ..

Single vs. Double Quotes

There is no difference between single and double quotes in Python, so do not let that confuse you. They can be used interchangeably.

One case where the choice matters is when your string itself contains a single or double quote, in which case you may want to use the opposite kind to wrap the string literal, so the quote inside the string does not break the encapsulation. Of course, you could alternatively just escape ( \ ) the quote that shows up inside the string.

In any case, single and double quotes are basically the same, so don't let that part confuse you if you start seeing them in code samples and documentation. It is just a stylistic choice, as the quick example below shows
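
For example, in plain Python (nothing LangChain-specific here):

# Both literals produce exactly the same string
print("Tell me a joke about chickens" == 'Tell me a joke about chickens')
# -> True

# Use the opposite quote style, or escape the quote, when the string contains one
print("Tell me a joke about O'Reilly books" == 'Tell me a joke about O\'Reilly books')
# -> True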

Prompt Template with pure strings

Actually, you do not even need to have variables in a PromptTemplate for it to work, though I am not sure there is much point in using a PromptTemplate for such a case. You might as well just pass a plain string to the LLM. This is how that looks

prompt_template = PromptTemplate.from_template(
    'Tell me a funny joke about elephants.'
)
print(prompt_template.format())
# -> 'Tell me a funny joke about elephants.'

Prompt Template with variables

This is the much more common use case for prompt templates. We have a template string, and we want to be able to dynamically replace only certain parts of it.

The variable parts of the template are surrounded by curly brackets { }, and to fill them in we pass keyword arguments (kwargs in Python), mapping each variable name to the text it should be replaced with, to the format() method on the PromptTemplate.

prompt_template = PromptTemplate.from_template(
    'Tell me a {adjective} joke about {content}'
)
print(prompt_template.format(adjective='funny', content='chickens'))
# -> 'Tell me a funny joke about chickens'
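
As a side note, from_template works out the variable names from the curly brackets for you, so you can check what a template expects via its input_variables attribute, which should print something like this:

# from_template parses the template and picks up the variable names
print(prompt_template.input_variables)
# -> ['adjective', 'content']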

Creating Prompt Templates via the Constructor

We can also create a PromptTemplate via the constructor, instead of via the from_template method

prompt_template = PromptTemplate(
    template='Tell me a {adjective} joke about {content}',
    input_variables=['adjective', 'content']
)
print(prompt_template.format(adjective='funny', content='chickens'))
# -> 'Tell me a funny joke about chickens'

Prompt Templates with Multiline Strings and Variables

This is actually just the same as a normal string. You just need to include triple quotes at the start and end and voila. Single or double quotes work the same

template = '''You are a joke generating chatbot.
Provide funny jokes based on the themes requested.

Question: Tell me a {adjective} joke about {content}

Answer: '''

prompt_template = PromptTemplate.from_template(template)
print(prompt_template.format(adjective='funny', content='chickens'))

Prompt Templates with f-strings and Variables

These come up all the time, and were probably the root of my confusion with template strings, which probably points to a gap in Python knowledge rather than anything actually to do with LangChain

prompt_template = PromptTemplate.from_template(
    f'Tell me a {{adjective}} joke about {{content}}'
)
print(prompt_template.format(adjective="funny", content="chickens"))

The trick with these is that the variables need to be surrounded with double curly brackets. The IDE also highlights these parts, making them easier to spot.

The reason for that is that f-strings parse anything inside curly brackets as Python expressions. If you want to include literal curly brackets inside an f-string, then, as per the Python docs:

If you need to include a brace character in the literal text, it can be escaped by doubling: {{ and }}.

A nice article with more details on f-strings is here also.
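
To see that brace escaping in plain Python, without any LangChain involved:

name = 'LangChain'
# a single pair of braces is evaluated as an expression,
# a doubled pair comes out as literal braces in the result
print(f'{name} templates use {{variable}} placeholders')
# -> 'LangChain templates use {variable} placeholders'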

So the only reason to use f-strings in PromptTemplates would be if there is some external variable or Python expression you would like to include in the PromptTemplate. This could be a legitimate use case depending on what you are building at the time.

If you do not have a need for that though, you are better off just using a normal string literal.

Just to demonstrate what that might look like, here is an example where we include today's date in our prompt template

from datetime import date

today = date.today()

prompt_template = PromptTemplate.from_template(
    f'Todays Date: {today}: Tell me a {{adjective}} joke about {{content}}',
    template_format='f-string'
)
print(prompt_template.format(adjective="funny", content="chickens"))
# -> 'Todays Date: 2024-04-01: Tell me a funny joke about chickens'

Prompt Templates with Multiline f-strings and Variables

This is just the same as the above, except again, we use triple quotes for the start and end of multiline strings, even if they are f-strings

template = f'''You are a joke generating chatbot.
Provide funny jokes based on the themes requested.

Question: Tell me a {{adjective}} joke about {{content}}

Answer: '''

prompt_template = PromptTemplate.from_template(template)
print(prompt_template.format(adjective='funny', content='chickens'))

Using a PromptTemplate in LLM Calls

No guide is complete without making some calls to an LLM right?

One of the use cases for PromptTemplates in LangChain is that you can pass the PromptTemplate as a parameter to an LLMChain, and on future calls to the chain you only need to pass in the variables you want to substitute in.

from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain
from langchain_openai import ChatOpenAI

prompt_template = PromptTemplate.from_template(
    'Tell me a {adjective} joke about {content}'
)

openai = ChatOpenAI(
    model_name='gpt-3.5-turbo-16k',
    openai_api_key='sk-'  # replace with your OpenAI API key
)

chain = LLMChain(llm=openai, prompt=prompt_template)

response = chain.invoke(
    input={'adjective': 'scary', 'content': 'French'}
)

print(response['text'])
# -> Why did the French ghost refuse to go into haunted houses?
# -> Because he had déjà boo!
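
Side note: we installed python-dotenv during setup, so rather than hardcoding the key, you can keep it in a .env file and load it before creating the model. A minimal sketch, assuming a .env file in the project directory that contains an OPENAI_API_KEY entry:

from dotenv import load_dotenv
from langchain_openai import ChatOpenAI

load_dotenv()  # reads .env and puts OPENAI_API_KEY into the environment

# ChatOpenAI picks the key up from the OPENAI_API_KEY environment variable
openai = ChatOpenAI(model_name='gpt-3.5-turbo-16k')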

Conclusion

Hope that was helpful!

Did I miss out on any PromptTemplate use cases that you use a lot, or that you would like me to add to the list above? Just let me know, and I would be happy to update the list based on some of your use cases.

Happy hacking!

If you enjoyed the article, and would like to stay up to date on future articles I release about building things with LangChain and AI tools, do hit the notification button so you can receive an email when they do come out.

Also, I am working on a LangChain course for web devs to help you get started building apps around Generative AI, Chatbots, Retrieval Augmented Generation (RAG) and Agents. If you like my writing style and the content sounds interesting, you can sign up here
