Prompt Templates with Azure OpenAI

The Power of Prompt Engineering

Akshay Raut
3 min read · Apr 3, 2024

What is a Prompt?

“If LLMs were genies in a bottle, prompts would be your wishes. Be careful what you wish for (write carefully), because the LLM tries its best to fulfill it!”

Prompts are simply the text input you send to an LLM. Anything you write to an LLM is a prompt.

What are Prompt Templates?

Imagine giving your Large Language Model (LLM) clear instructions and context to guide its responses. Prompt Templates offer a structured way to achieve this. They act as blueprints, allowing you to define:

  • Instructions: Specify the desired LLM behavior (e.g., translate text, write a review).
  • Context: Provide relevant information or examples to enhance understanding.
  • Input: Indicate where the user’s input will be integrated within the prompt.
[Image: Prompt Template Example]
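Conceptually, a prompt template is just a parameterized string that combines these three parts. Here is a minimal plain-Python sketch (the translation task and text are illustrative, not from the article's code):

```python
# A template combining instructions, context, and an input placeholder.
template = (
    "Translate the following text to French.\n"       # instructions
    "Context: the text is a short product review.\n"  # context
    "Text: {text}\n"                                  # slot for user input
    "Translation:"
)

# Fill the placeholder with the user's input at request time.
prompt = template.format(text="Great battery life!")
print(prompt)
```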

Why are Prompt Templates Important?

Effective prompts are crucial for obtaining accurate and relevant outputs from LLMs. Prompt Templates streamline this process by:

  • Enhancing Clarity: Consistent structure ensures clear communication with the LLM.
  • Simplifying Complex Tasks: Break down intricate tasks into digestible prompts for the LLM.
  • Reusability: Easily adapt existing templates for new scenarios, saving development time.
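Reusability is the easiest of these to see in code: a single "expert" template can be adapted to new scenarios just by swapping placeholder values. A hypothetical sketch (the domains and questions are made up for illustration):

```python
# One reusable template, adapted to new scenarios via placeholders.
expert_template = (
    "You are a {domain} expert. Answer the following question "
    "in simple words: {question}"
)

cricket_prompt = expert_template.format(
    domain="cricket", question="What is an over?"
)
chess_prompt = expert_template.format(
    domain="chess", question="What is castling?"
)

print(cricket_prompt)
print(chess_prompt)
```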

Example: Creating a Cricket Expert

With our environment in place, we can explore Prompt Templates. Here’s how to create a prompt that turns your LLM into a cricket expert:

Remember:

Have your Azure OpenAI service set up and your .env file with the correct API keys in place before running this code.
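The code below reads these variable names from the environment. A sample .env might look like the following (the values are placeholders for your own Azure OpenAI resource, and the API version shown is only an example):

```shell
# .env — placeholder values; substitute your Azure OpenAI resource details
AZURE_OPENAI_API_KEY=<your-api-key>
AZURE_OPENAI_API_TYPE=azure
AZURE_OPENAI_API_BASE=https://<your-resource>.openai.azure.com/
AZURE_OPENAI_API_VERSION=2023-05-15
```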

# app.py

import os

import openai
from dotenv import find_dotenv, load_dotenv
from langchain import PromptTemplate
from langchain.chat_models import AzureChatOpenAI

load_dotenv(find_dotenv())

OPENAI_API_KEY = os.getenv("AZURE_OPENAI_API_KEY")
OPENAI_API_TYPE = os.getenv("AZURE_OPENAI_API_TYPE")
OPENAI_API_BASE = os.getenv("AZURE_OPENAI_API_BASE")
OPENAI_API_VERSION = os.getenv("AZURE_OPENAI_API_VERSION")

# Configure the openai module for Azure
openai.api_type = OPENAI_API_TYPE
openai.api_base = OPENAI_API_BASE
openai.api_version = OPENAI_API_VERSION
openai.api_key = OPENAI_API_KEY

llm = AzureChatOpenAI(
    openai_api_version=OPENAI_API_VERSION,
    openai_api_key=OPENAI_API_KEY,
    openai_api_base=OPENAI_API_BASE,
    openai_api_type=OPENAI_API_TYPE,
    deployment_name="your_deployment_name",
    temperature=0.7,
)


# Optional helper: call the Azure OpenAI chat completion endpoint directly
def get_completion(prompt):
    messages = [{"role": "user", "content": prompt}]
    chat_completion = openai.ChatCompletion.create(
        deployment_id="your_deployment_name",
        messages=messages,
        temperature=0.7,
        max_tokens=1024,
        n=1,
    )
    return chat_completion.choices[0].message.content


# Create the prompt template
prompt_template: str = """You are a cricket expert. Give responses to the following
question: {question}. Do not use technical words; give easy
to understand responses.
"""

prompt = PromptTemplate.from_template(template=prompt_template)

# Questions
questions = [
    "Who won the 2011 World Cup?",
    "Who won Bigg Boss 17?",
]

# Iterate through the questions
for question in questions:
    prompt_formatted_str = prompt.format(question=question)
    prediction = llm.predict(prompt_formatted_str)

    print("Question:", question)
    print("Answer:", prediction)
    print("-----------------")  # separator between results

Result:

[Image: LLM Result]

Key Points:

  • Prompt Structure: The template clearly defines the LLM’s role as a cricket expert and gives instructions on how to format its response.
  • Customizable Questions: The {question} placeholder lets you easily swap in different cricket-related questions.

Conclusion

That’s all for this article. See you in the next one in the series.

If you enjoyed this article, consider following me for more 🙂.

