OpenAI Function Calling Examples

Setting the foundation for AGI composed of Autonomous Agents

Ivan Campos
Sopmac AI
7 min read · Jun 27, 2023


In this article, we will discuss the following:

  • The role that OpenAI functions may play on the road to AGI
  • How to enable OpenAI Function Calling via the openai Python module
  • Code example to use OpenAI Functions to output structured JSON
  • Code example to use OpenAI Functions to call an external API

AGI

Artificial General Intelligence (AGI) will be reached when AI is able to surpass human-level intelligence. This means an AGI can learn from experience, understand context, solve diverse problems, exhibit creativity and intuition, and apply knowledge across domains.

Agents

At a recent hackathon, Harrison Chase (the co-founder and CEO of LangChain) discussed how we may get to AGI:

“AGI is gonna be a bunch of small Agents working together.” — via @mokoseth

In the context of AGI, an Agent is a type of AI that can operate and make decisions independently — typically without the need for human intervention. Ultimately, the Agents are capable of perceiving their environment, interpreting collected data, making decisions based on this data, and then acting on these decisions to achieve their goal.

“Tools are interfaces that an agent can use to interact with the world.” — LangChain

OpenAI Functions

OpenAI Functions provide similar functionality to LangChain Tools. The Functions allow for interaction with the outside world.

The expectation is that these Functions will be utilized by Agents, and that these Autonomous Agents will eventually work together to enable AGI.

Functions → Agents → Artificial General Intelligence

While AGI remains theoretical, OpenAI Functions can be applied today. With these Functions, developers can integrate the OpenAI API with external APIs and perform intelligent function calls.

GPT Function Calling

The OpenAI API exposes the latest GPT models (gpt-3.5-turbo-0613 and gpt-4-0613) with function calling to:

  1. Generate and return structured content as JSON
  2. Return a function name and arguments that your code can catch, either to augment the generated text or to call an external API

OpenAI Functions Specs

The OpenAI API supports the creation of one or many functions — with each function containing the following:

  • name (string) Required: The name of the function to be called.
  • description (string) Optional: The description of what the function does. Treat these like well-crafted prompts.
  • parameters (object) Optional: The parameters the function accepts, described as a JSON Schema object.

If you’re looking to jumpstart the creation of your parameters object, use the following ChatGPT prompt to create the first draft of your schema:

“generate a schema for {insert your topic and fields} that adheres to the JSON Schema specification from json-schema.org”
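As an illustration, such a prompt might produce a first draft like the one below. The topic ("movie recommendations") and its fields are hypothetical, chosen only to show the shape of a parameters object; this is the Python dict you would place inside a function definition:

```python
# Hypothetical first-draft schema for a "movie_recommendations" topic,
# adhering to the JSON Schema specification from json-schema.org.
movie_schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string", "description": "The movie title."},
        "year": {"type": "integer", "description": "The release year."},
        "genres": {
            "type": "array",
            "items": {"type": "string"},
            "description": "Genres the movie belongs to."
        }
    },
    # Fields the model must always populate when calling the function.
    "required": ["title", "year"]
}

print(sorted(movie_schema["required"]))
```

From there, you would hand-tune descriptions and constraints (such as minItems/maxItems on arrays) before passing the schema to the API.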

function_call

With the function_call property set to auto (the default), the model decides, based on the prompt and each function's description, whether to call a function or respond directly without one.

You can also set the value to the name of a function ({"name": "related_subreddits"}) to force the model to always call that function.

Lastly, the value can be set to none to ensure that the model never calls a function.
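The accepted values can be summarized in a small sketch. Note that validate_function_call is a hypothetical helper written for this article, not part of the openai module:

```python
def validate_function_call(value, function_names):
    """Check that a function_call value is one the API would accept.

    value may be "auto", "none", or {"name": <a defined function name>}.
    """
    if value in ("auto", "none"):
        return True
    if isinstance(value, dict):
        # A forced call must name a function defined in the `functions` list.
        return value.get("name") in function_names
    return False

print(validate_function_call({"name": "stock_info"}, ["stock_info"]))
```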

Example 1: Structured Output

import openai
from tenacity import retry, wait_random_exponential, stop_after_attempt

system_prompt = """
You are a helpful assistant.
"""

@retry(wait=wait_random_exponential(min=1, max=40), stop=stop_after_attempt(3))
def chat_with_functions(user_prompt: str, model_name: str = "gpt-4-0613"):
    completion = openai.ChatCompletion.create(
        model=model_name,
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt}
        ],
        functions=[
            {
                "name": "related_subreddits",
                "description": "List of the top subreddits from reddit.com based on the topic provided.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "links": {
                            "type": "array",
                            "items": {"type": "string"},
                            "minItems": 5,
                            "maxItems": 10,
                            "description": "URL to subreddit from reddit.com related to the topic provided. These are real sites. Do not make up URLs."
                        }
                    },
                    "required": ["links"]
                }
            },
        ],
        function_call="auto",
        temperature=0,
    )

    message = completion.choices[0].message
    function_used = hasattr(message, "function_call")
    return message.function_call.arguments if function_used else message.content

Here’s a breakdown of what the code is doing:

  1. It first imports the necessary libraries: openai and tenacity.
  2. It then defines a system prompt, which sets the context of the conversation for the AI.
  3. The @retry decorator from the tenacity library is used to automatically retry the chat_with_functions function if an exception is raised. It uses a random exponential backoff strategy for the wait time between retries (between 1 to 40 seconds), and stops retrying after 3 attempts.
  4. The chat_with_functions function creates a new chat completion using the openai.ChatCompletion.create method. This method takes the following parameters:
  • model: the model to use for the chat, which is set to "gpt-4-0613". Starting on June 27, 2023, both “gpt-4” and “gpt-3.5-turbo” will support OpenAI functions.
  • messages: a list of messages to seed the chat. The first message is from the "system" and sets the initial context ("You are a helpful assistant."). The second message is from the "user" and contains the user's input.
  • functions: a list of functions that the model can call during the conversation. In this case, a function called "related_subreddits" is defined which takes an array of strings as input and returns a list of subreddits.
  • function_call: set to "auto", which means the model will decide when to call the function.
  • temperature: set to 0, which means the model's output will lack randomness, choosing the most likely completion at each step.

  5. The function then extracts the AI’s response from the completion. If the response includes a function call, it returns the arguments of the function call. If not, it returns the content of the AI’s message.

If you call chat_with_functions("where in reddit to learn about AI?"), you will receive the following inside the completion variable:

{
  "id": "chatcmpl-7VbrTkESIjWi7QORSj6qnrK0Sl000",
  "object": "chat.completion",
  "created": 1687768063,
  "model": "gpt-4-0613",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": null,
        "function_call": {
          "name": "related_subreddits",
          "arguments": "{\n \"links\": [\"https://www.reddit.com/r/artificial/\", \"https://www.reddit.com/r/MachineLearning/\", \"https://www.reddit.com/r/learnmachinelearning/\", \"https://www.reddit.com/r/deeplearning/\", \"https://www.reddit.com/r/ArtificialInteligence/\"]\n}"
        }
      },
      "finish_reason": "function_call"
    }
  ],
  "usage": {
    "prompt_tokens": 96,
    "completion_tokens": 74,
    "total_tokens": 170
  }
}

What this demonstrates is the use of generative text to create a structured JSON response. In this case, there is a function_call named related_subreddits being used and its arguments are populated with valid values that adhere to the function’s schema.

An alternative method is to use the schema to classify and store request parameter(s) that you can then use to call some code or an external API when the function_call is used.

Example 2: Calling an external service

import openai
from tenacity import retry, wait_random_exponential, stop_after_attempt

system_prompt = """
You are a helpful assistant.
"""

@retry(wait=wait_random_exponential(min=1, max=40), stop=stop_after_attempt(3))
def chat_with_functions(user_prompt: str, model_name: str = "gpt-4-0613"):
    completion = openai.ChatCompletion.create(
        model=model_name,
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt}
        ],
        functions=[
            {
                "name": "stock_info",
                "description": "get financial information about a given company based on its stock ticker.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "ticker": {
                            "description": "The ticker symbol of the stock.",
                            "type": "string"
                        }
                    },
                    "required": ["ticker"]
                }
            },
        ],
        function_call={"name": "stock_info"},
        temperature=0,
    )

    message = completion.choices[0].message
    function_used = hasattr(message, "function_call")
    return message.function_call.arguments if function_used else message.content

Note that function_call is set to the name of the function. This will force the chat completion endpoint to use the specified function — in this case stock_info.

Now, if you call chat_with_functions("tell me about AAPL"), you should receive:

{\n  "ticker": "AAPL"\n}

With the ticker symbol extracted from the user_prompt, you can pass this ticker into an external API (e.g. yahoo finance) call:

import json
import yfinance as yf
chat_response = chat_with_functions("tell me about AAPL")  # the arguments JSON string
stock_info = yf.Ticker(json.loads(chat_response).get("ticker"))

It’s worth reinforcing that you can provide multiple functions and have the model determine (function_call="auto") which function to call or to call no function at all and just provide a direct response from the model.

@retry(wait=wait_random_exponential(min=1, max=40), stop=stop_after_attempt(3))
def chat_with_functions(user_prompt: str, model_name: str = "gpt-4-0613"):
    completion = openai.ChatCompletion.create(
        model=model_name,
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt}
        ],
        functions=[
            {
                "name": "related_subreddits",
                "description": "List of the top subreddits from reddit.com based on the topic provided.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "links": {
                            "type": "array",
                            "items": {"type": "string"},
                            "minItems": 5,
                            "maxItems": 10,
                            "description": "URL to subreddit from reddit.com related to the topic provided. These are real sites. Do not make up URLs."
                        }
                    },
                    "required": ["links"]
                }
            },
            {
                "name": "stock_info",
                "description": "get financial information about a given company based on its stock ticker.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "ticker": {
                            "description": "The ticker symbol of the stock.",
                            "type": "string"
                        }
                    },
                    "required": ["ticker"]
                }
            },
        ],
        function_call="auto",
        temperature=0,
    )

    message = completion.choices[0].message
    function_used = hasattr(message, "function_call")
    return message.function_call.arguments if function_used else message.content

The primary difference between the chat_with_functions(user_prompt) examples is that:

  • The first example outputs text from generative AI in a structured format
  • The second example uses OpenAI functions as a router to make a call to an external API
  • The final iteration of the code demonstrates having a collection of functions comprised of the first two examples
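When multiple functions are registered, your code still has to route the returned function name to real logic. A minimal sketch of that dispatch step follows; handle_related_subreddits and handle_stock_info are hypothetical handlers, and msg mimics the dict shape of completion.choices[0].message in the API response:

```python
import json

# Hypothetical handlers standing in for real lookups or API calls.
def handle_related_subreddits(links):
    return f"{len(links)} subreddits found"

def handle_stock_info(ticker):
    return f"looking up {ticker}"

# Map each name from the `functions` list to local code.
DISPATCH = {
    "related_subreddits": handle_related_subreddits,
    "stock_info": handle_stock_info,
}

def route(message):
    """Route a chat completion message to the matching handler."""
    call = message.get("function_call")
    if call is None:
        return message["content"]         # model answered directly
    handler = DISPATCH[call["name"]]
    args = json.loads(call["arguments"])  # arguments arrive as a JSON string
    return handler(**args)

# A message shaped like the Example 2 response:
msg = {"function_call": {"name": "stock_info",
                         "arguments": "{\"ticker\": \"AAPL\"}"}}
print(route(msg))  # → looking up AAPL
```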

Conclusion

With our OpenAI Functions created, we can begin building Agents to use these Functions.

I will leave you with the following notes about Agents and AGI from Andrej Karpathy (previously Director of AI @ Tesla and now @ OpenAI) at the same hackathon mentioned previously:

“AGI’s form factor likely to look like many autonomous agents” — via @dina_yrl
