Exploring the Power of OpenAI’s Chat Completion API: A Beginner’s Guide

Hira Ahmad
Published in The Deep Hub
Apr 2, 2024 · 2 min read

OpenAI’s Chat Completion API offers a fascinating glimpse into the world of conversational AI, enabling seamless interactions between users and AI models. Let’s embark on a journey to understand its capabilities, starting with basic examples and gradually advancing to more complex scenarios.

OpenAI’s Chat Completion API
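
Throughout the examples, client refers to an OpenAI client object from the official openai Python package (v1.x). Here is a minimal setup sketch, assuming your API key is stored in the OPENAI_API_KEY environment variable:

from openai import OpenAI

# Creates a client; the API key is read from the OPENAI_API_KEY environment variable by default
client = OpenAI()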

Simple Prompt and Response

In this basic example, we’ll start with a simple prompt and observe the response generated by the AI model.

# Basic Prompt
prompt = "What is the capital of Pakistan?"

# Call the Chat Completion API
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)

# Display the AI assistant's response
print(response.choices[0].message.content)

Separate Prompt and Call

Next, let’s keep the prompt definition separate from the call to the Chat Completion API, so the same call pattern can be reused with different prompts as the conversation grows.

# Define Prompt
prompt = "Tell me about the weather in Islamabad."

# Call the Chat Completion API
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)

# Display the AI assistant's response
print(response.choices[0].message.content)
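
The messages parameter is also where conversation flow is shaped: it accepts a list of role-tagged messages, so you can add a system instruction or earlier turns alongside the user prompt. A short sketch (the system wording here is illustrative, not part of the original example):

# Define a system instruction (illustrative wording) and the user prompt
system_instruction = "You are a concise assistant that answers in one sentence."
prompt = "Tell me about the weather in Islamabad."

# Call the Chat Completion API with both messages
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": system_instruction},
        {"role": "user", "content": prompt},
    ],
)

# Display the AI assistant's response
print(response.choices[0].message.content)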

Defining Functions and External API Calls

Now, let’s introduce functions to handle specific tasks, such as fetching weather information from an external API, and seamlessly integrate them into our conversational AI.

import requests

# Function to get current weather information from OpenWeatherMap
def get_weather(city):
    api_key = "YOUR_API_KEY"
    url = f"https://api.openweathermap.org/data/2.5/weather?q={city}&appid={api_key}"
    response = requests.get(url)
    data = response.json()
    weather_description = data['weather'][0]['description']
    return f"The weather in {city} is currently {weather_description}."

# Define prompts
prompts = [
    "What's the weather like in Islamabad?",
    "How's the weather in Lahore today?",
    "Tell me about the weather in Karachi City."
]

# Call the Chat Completion API for each prompt and fetch real weather data
for prompt in prompts:
    # Call the Chat Completion API
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )

    # Extract the city from the prompt: the word right after "in", with trailing punctuation stripped
    words = prompt.rstrip("?.").split()
    city = words[words.index("in") + 1]

    # Get weather information using the external API
    weather_info = get_weather(city)

    # Display the AI assistant's response and the weather information
    print(f"User: {prompt}")
    print(f"AI: {response.choices[0].message.content}")
    print(f"Weather Info: {weather_info}")
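
In the loop above, the model’s answer and the fetched weather are printed side by side but not yet connected. One way to make the assistant’s reply reflect the live data (a sketch of one possible approach, not the article’s original flow) is to pass the fetched weather back into the Chat Completion call as context:

# Sketch: ground the assistant's answer in the fetched weather data
prompt = "What's the weather like in Islamabad?"
weather_info = get_weather("Islamabad")

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        # Provide the external data as context the model should use
        {"role": "system", "content": f"Use this live weather data when answering: {weather_info}"},
        {"role": "user", "content": prompt},
    ],
)
print(response.choices[0].message.content)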

Conclusion

As we’ve explored the Chat Completion API, we’ve moved from basic prompts to more advanced interactions that combine the model’s responses with helper functions and external API calls. By mastering these techniques, developers can create rich and engaging conversational experiences, unlocking the full potential of AI-driven interactions. With continued experimentation and innovation, the possibilities for conversational AI keep expanding, paving the way for human-computer interactions that feel seamless and intuitive.
