Exploring OpenAI’s Latest V2 (Beta) API: Assistants and Tools.

Tim Rizvanov
5 min read · Jun 29, 2024


Introduction

Hi Everyone!

OpenAI’s latest V2 (Beta) API has undergone significant changes, particularly in how services are accessed. The idea, as it appears to me, is to provide a more streamlined, “one-stop shop” solution.

With this new setup, a single assistant can handle a wide variety of tasks, including combining custom tools and even (this is a big one) vector databases stored within OpenAI!

While reading the documentation and the provided examples for the updated product, I found them lacking real-life context. For instance, the Tools JSON structure defines the names and parameters of the functions the assistant may call. However, it doesn’t make it obvious that these functions still need to be written and wired up by you.

After spending a couple of days achieving results that I consider to be adequate, I decided to share my experience to save others some time. This article aims to provide a relatively short, fuss-free guide based on hours of hands-on experience and troubleshooting.

We will walk through creating an assistant, setting up various functions, and handling threads and messages, using an example closer to what one might build in real life. Whether you’re new to OpenAI or looking to explore the latest features, this guide is designed to help you get started quickly and efficiently. For those interested in vector database creation and storage, stay tuned for an upcoming article that will cover that in detail.

One important caveat: At the time of writing, the v2 API is still in Beta, so examples provided here may not work the same way in the future. I will, however, do my best to keep it up to date.

Steps to Create and Use an Assistant

1. Set Up the Environment
   • Install the necessary libraries and set your API key.

2. Create an Assistant
   • Define your assistant with specific instructions and tools.

3. Create a Thread
   • Initiate a conversation thread for interactions.

4. Send a Message
   • Communicate with the assistant by sending a message.

5. Run and Poll the Thread
   • Execute the run and retrieve responses.

6. Submit Tool Outputs
   • Handle tool outputs required by the assistant.

7. View Assistants
   • Manage your created assistants on the OpenAI Platform.

Step 1: Set Up the Environment

Install the OpenAI Python library and set your OpenAI API key:

pip install openai

import os
from openai import OpenAI

GPT_MODEL = "gpt-3.5-turbo"
OPENAI_KEY = os.environ['OPENAI_API_KEY']
client = OpenAI(api_key=OPENAI_KEY)
Step 2: Create an Assistant

Define the assistant with its instructions and the tools it is allowed to call:

tools_assistant = client.beta.assistants.create(
    name="Dates Assistant",
    instructions="You are a helpful assistant who can use tools and answer questions about dates. You will output dates in dd/mm/yyyy format",
    model=GPT_MODEL,
    temperature=0.5,
    response_format="auto",
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_todays_date",
                "description": "Get today's date",
                "parameters": {
                    "type": "object",
                    "properties": {},
                    "required": []
                },
            }
        },
        {
            "type": "function",
            "function": {
                "name": "get_date_range_n_days_from_today",
                "description": "Get the date range starting n days from today",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "n": {
                            "type": "integer",
                            "description": "Number of days from today to start the range"
                        }
                    },
                    "required": ["n"]
                },
            }
        }
    ]
)

Note: It is worth playing around with the temperature setting until you’re happy with the result. Generally, I’d start by setting it quite low (0.1, for example), do a couple of runs, then bump the value closer to 1, which makes the model much more adventurous (it’s fun!). I then dial it back down to around 0.5, which seems to be a happy medium for using Tools. I digress …
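
If you’d rather not recreate the assistant every time you tweak the temperature, updating the existing one should work just as well. A minimal sketch (the value here is arbitrary, pick whatever you are testing):

client.beta.assistants.update(
    tools_assistant.id,   # the assistant created above
    temperature=0.7       # experiment with different values here
)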

Functions for Date Operations

Here are the functions that the assistant will use to perform the actual tasks:

from datetime import datetime, timedelta

def get_todays_date():
    today = datetime.now()
    formatted_date = today.strftime('%d/%m/%Y')
    print(f"Debug: get_todays_date() = {formatted_date}")
    return formatted_date

def get_date_range_n_days_from_today(n):
    start_date = datetime.now() + timedelta(days=n)
    end_date = start_date + timedelta(days=n)

    start_date_formatted = start_date.strftime('%d/%m/%Y')
    end_date_formatted = end_date.strftime('%d/%m/%Y')

    date_range = f"{start_date_formatted} - {end_date_formatted}"
    print(f"Debug: get_date_range_n_days_from_today({n}) = {date_range}")
    return date_range
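
Before handing these over to the assistant, it’s worth a quick sanity check that the helpers return what we expect (the dates in the comments are just illustrative):

print(get_todays_date())                    # e.g. 29/06/2024
print(get_date_range_n_days_from_today(7))  # e.g. 06/07/2024 - 13/07/2024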

Parallel Function Calling

OpenAI’s V2 (Beta) API handles parallel execution of these functions automatically. This means your assistant can manage multiple tasks at once, improving efficiency. If, for whatever reason, you prefer sequential execution over parallel, you can opt out of parallel tool calls — check out the OpenAI Function Calling Guide for details.
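
As a rough sketch of what opting out might look like, here is the Step 5 run call with parallel tool calls disabled. I’m assuming the parallel_tool_calls flag on the Runs API here; since the v2 API is still in Beta, double-check the current reference before relying on it:

run = client.beta.threads.runs.create_and_poll(
    thread_id=thread.id,               # the thread created in Step 3
    assistant_id=tools_assistant.id,
    parallel_tool_calls=False          # assumed flag: one tool call at a time
)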

Step 3: Create a Thread

Start a conversation by creating a thread:

thread = client.beta.threads.create()

Step 4: Send a Message

Send a message to the assistant within the thread. The content is fairly conversational, which is why we left the model some creative leeway earlier (temperature of 0.5).

message = client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Give me a list of dates in August 2024 that fall within a period of 1 week."
)

print(message)

Step 5: Run and Poll the Thread

Execute the run and retrieve the assistant’s response. Note that if the assistant decides it needs one of our tools, the run will finish with a status of "requires_action" rather than "completed"; we handle that case in Step 6:

run = client.beta.threads.runs.create_and_poll(
    thread_id=thread.id,
    assistant_id=tools_assistant.id,
)

if run.status == "completed":
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    for message in messages:
        print({"role": message.role, "message": message.content[0].text.value})
else:
    print("Run status:", run.status)

Step 6: Submit Tool Outputs

Handle the tool outputs required by the assistant. When the run status is "requires_action", the assistant is waiting for us to execute the requested functions and submit their outputs back to the run:

import json

tool_outputs = []

if run.status == "requires_action":
    for tool in run.required_action.submit_tool_outputs.tool_calls:
        if tool.function.name == "get_todays_date":
            tool_outputs.append({"tool_call_id": tool.id, "output": get_todays_date()})
        elif tool.function.name == "get_date_range_n_days_from_today":
            n = json.loads(tool.function.arguments)["n"]
            tool_outputs.append({"tool_call_id": tool.id, "output": get_date_range_n_days_from_today(n)})

if tool_outputs:
    try:
        run = client.beta.threads.runs.submit_tool_outputs_and_poll(
            thread_id=thread.id,
            run_id=run.id,
            tool_outputs=tool_outputs
        )
        print("Tool outputs submitted successfully.")
    except Exception as e:
        print("Failed to submit tool outputs:", e)
else:
    print("No tool outputs to submit.")

Step 7: View Assistants

We can view and manage the assistants we created by visiting the OpenAI Platform.

Worth remembering that once you create an assistant, it will persist within the Platform. While still experimenting with the system, I would generally delete the generated assistants to avoid confusion.

Once you find an assistant that works well for your needs, you can simply retrieve it using its ID. I will cover managing assistants in one of the upcoming articles.
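
The same housekeeping can also be done from code rather than the web UI. A brief sketch (the "asst_abc123" ID is a placeholder for your own assistant’s ID):

# List the most recent assistants on the account
my_assistants = client.beta.assistants.list(order="desc", limit=20)
for assistant in my_assistants.data:
    print(assistant.id, assistant.name)

# Retrieve an assistant you want to reuse (placeholder ID)
tools_assistant = client.beta.assistants.retrieve("asst_abc123")

# Delete experiments you no longer need (placeholder ID)
client.beta.assistants.delete("asst_abc123")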

Conclusion

Well, that’s all I wanted to share this time around. I personally think this is a big step forward for OpenAI, and it makes the lives of those who are just starting their journey in building A.I. systems a bit easier.

Stay tuned for more on advanced features like vector database creation and storage.

Thank you for reading!
