ChatGPT Template using Python-LangChain
Have you noticed that these days we often ask OpenAI's models to do repetitive tasks, and sometimes analytical ones, by typing essentially the same prompt over and over?
This article shows you how to customize ChatGPT with your own template and prompt so that this kind of analysis can be automated.
OpenAI has released several models such as GPT-3.5, GPT-4, DALL-E, etc. They are built to handle different use cases, and here we focus on GPT-3.5 (or GPT-4) to generate results based on a custom template.
If you are new to the GPT world, I suggest trying ChatGPT first before starting the development below. (ChatGPT: https://chat.openai.com/)
Normally, we need to define our prompt every time we want to chat with GPT. Here is an example prompt: “you are an expert in fashion. Could you provide 5 trending synonyms of puffer coats? Please split the result using comma delimiter”. This prompt asks ChatGPT to act as a fashion expert and find trending synonyms for a jacket/coat type.
Imagine how repetitive this task becomes if you need to ask the same question for other jacket types (e.g. blazer, faux fur coat, sweater), and each time we have to remind ChatGPT to return the result in the proper format.
Hence, we need to automate this using a template and pass the prompt inside it.
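To make the idea concrete before introducing LangChain, here is a minimal plain-Python sketch of a reusable prompt template (build_prompt and product_name are just illustrative names I chose, not part of the final script):

# Minimal sketch: keep one template and fill in the product name per question.
PROMPT_TEMPLATE = (
    "You are an expert in fashion. Could you provide 5 trending synonyms of "
    "{product_name}? Please split the result using a comma delimiter."
)

def build_prompt(product_name):
    return PROMPT_TEMPLATE.format(product_name=product_name)

print(build_prompt("puffer coat"))
print(build_prompt("blazer"))

LangChain gives us a more structured version of this same idea, plus memory and model integration, as we will see below.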
OpenAI API Key
OpenAI sells API access, and a few dollars (or the trial credit) is enough for this tutorial. This step is mandatory: we need an OpenAI API key before continuing. Our Python script will read the API key and pass the prompt accordingly.
Save this API key as an environment variable so we do not need to hardcode it inside the Python script later.
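For example, assuming the variable is named CHATGPT_API_KEY (the name used by the script in this article), the key can be read at runtime like this:

import os

# Read the key from the environment instead of hardcoding it in the script.
api_key = os.environ.get("CHATGPT_API_KEY")
if api_key is None:
    raise RuntimeError("CHATGPT_API_KEY is not set in the environment")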
Let’s install the openai package in our Python environment:
pip install openai==0.27.9
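If you want a quick, optional sanity check that the key works before adding LangChain (this snippet is my addition, using the openai==0.27.9 interface, where openai.ChatCompletion is still available), you can run:

import os
import openai

# Optional sanity check of the API key using the openai 0.27.x interface.
openai.api_key = os.environ.get("CHATGPT_API_KEY")
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(response["choices"][0]["message"]["content"])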
LangChain
LangChain is a framework for developing applications powered by language models, with Python as one of its main supported languages. I recommend LangChain because it is one of the best frameworks for building LLM applications and custom LLM integrations.
Let’s set up LangChain in our Python environment.
pip install langchain==0.0.272
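A quick import check confirms the installation and version:

import langchain

# Confirm the installed version matches the one this article was written against.
print(langchain.__version__)  # expected: 0.0.272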
Development
Once we have installed all the dependencies, let’s start the coding:
LangChain and OpenAI Library
We are using conversation buffer memory because we want to keep the existing chat in memory and chain one conversation with the next, so the model receives the previous turns as context every time we send a new prompt.
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.chains import LLMChain
from langchain.prompts import (
    ChatPromptTemplate,
    MessagesPlaceholder,
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate,
)
import os
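To make the role of ConversationBufferMemory more concrete, here is a small standalone sketch (my own illustration, not part of the main script) of what it stores between turns:

from langchain.memory import ConversationBufferMemory

# Standalone illustration: each saved turn is kept under the "chat_history" key
# and returned as message objects because return_messages=True.
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
memory.save_context(
    {"question": "what are the trending synonyms of puffer coat?"},
    {"text": "puffer jacket, down jacket, quilted coat"},
)
print(memory.load_memory_variables({})["chat_history"])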
LLM Function
Our LLM uses GPT-3.5 and reads the API key from an environment variable. The LangChain template defines the model as an expert in fashion, so the assistant behaves as closely as possible to a human specialist. Lastly, we pass the question parameter into the template based on our prompt.
def define_model():
    # LLM
    llm = ChatOpenAI(
        model_name="gpt-3.5-turbo",
        openai_api_key=os.environ.get('CHATGPT_API_KEY'),
    )
    # Prompt
    prompt = ChatPromptTemplate(
        messages=[
            SystemMessagePromptTemplate.from_template(
                "You are a formal assistant that is an expert in fashion taxonomy. "
                "You know how to find the trending keywords for fashion products. "
                "Remove any explanation or description. Please use a comma as the separator of the result, and no joke answers."
            ),
            # The `variable_name` here is what must align with memory
            MessagesPlaceholder(variable_name="chat_history"),
            HumanMessagePromptTemplate.from_template("{question}"),
        ]
    )
    # Notice that we set `return_messages=True` to fit into the MessagesPlaceholder
    # Notice that `"chat_history"` aligns with the MessagesPlaceholder name
    memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
    conversation = LLMChain(llm=llm, prompt=prompt, verbose=False, memory=memory)
    return conversation
Find Keyword Similarity
This function is how we consume our conversation model: it passes the product name into the template, so we only need to change the product name parameter to ask another question.
def find_keyword(conversation, product_name):
    # Get OpenAI response
    response = conversation(
        {
            "question": f"what are the trending synonyms of {product_name}?"
        }
    )
    list_keyword = response['text']
    print("trending keywords of ", product_name, ": ", list_keyword)
    return list_keyword
Main Program
Let’s call the LLM and ask some questions:
if __name__ == "__main__":
    conv = define_model()
    find_keyword(conv, "puffer coat")
    find_keyword(conv, "faux fur coat")
    find_keyword(conv, "blazer coat")
LLM Results:
C:\Users\dmitr\Py-LangChain-ChatGPT-VirtualAssistance\Scripts\python.exe "D:\00 Project\00 My Project\IdeaProjects\Py-LangChain-ChatGPT-VirtualAssistance\00_LangChainPromptTemplate.py"
trending keywords of puffer coat : puffer jacket, down jacket, quilted coat, padded coat, bubble coat, parka, winter coat
trending keywords of faux fur coat : faux fur jacket, faux fur parka, faux fur vest, faux fur stole, faux fur cape, faux fur shawl, faux fur trim coat
trending keywords of blazer coat : blazer jacket, tailored coat, structured coat, suit jacket, formal coat, dress coat, blazer-style coat
Process finished with exit code 0
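Because the chain carries a ConversationBufferMemory, all three question/answer turns above are retained in one history. If you want to confirm this (an optional check I added, assuming the same conv object), append these lines at the end of the main block:

    # Optional check: the buffer memory now holds all three question/answer turns.
    for message in conv.memory.chat_memory.messages:
        print(type(message).__name__, ":", message.content)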
Summary
Python, OpenAI, and LangChain collectively represent a powerful synergy in the realm of artificial intelligence and programming. OpenAI has made significant strides in advancing AI technology, most notably through its development of large language models like GPT-4. These models, which are capable of understanding and generating human-like text, can be integrated into Python-based applications, enabling a wide range of AI-powered features. LangChain is a framework that combines language models with prompts, memory, and other components into chains, extending what the models can do in multi-step tasks.
This integration allows for more sophisticated AI applications, as it harnesses the computational power of Python, the advanced language understanding of OpenAI’s models, and the logical reasoning facilitated by Langchain. Together, these technologies are pushing the boundaries of what’s possible in AI, offering developers and researchers innovative tools to create more intelligent, responsive, and capable AI systems.
GitHub Repository
If you like this article, please subscribe and give the repository a star.
Thank you