Leveraging LLM Power: Integrating Azure OpenAI with Custom Tools

Anupkumar Ghogare
4 min read · Sep 3, 2024


Introduction

In the fast-paced domain of artificial intelligence, it is essential to utilize tools that augment the capabilities of Large Language Models (LLMs). A particularly potent combination is the integration of Azure OpenAI with custom functions. This article dives into the methodology for generating and executing SQL queries on an existing database using this powerful synergy.

What is Azure OpenAI?

Azure OpenAI provides access to OpenAI’s powerful language models through the Azure platform. It allows developers to integrate advanced AI capabilities into their applications, enhancing productivity and innovation. For more detailed information on Azure OpenAI, please visit the official documentation.

Introducing Function Calling (Tools) with LLM

The newest iterations of Large Language Models have been fine-tuned to integrate seamlessly with functions. They can intelligently decide both when and how to invoke a function. When your request includes one or more functions, the model evaluates the context to determine whether any function should be called. Once it determines which function to call, it responds with a JSON object containing the function name and the arguments for the function.
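To make this concrete, here is the rough shape of a tool call as it appears in the model's response (field names follow the Chat Completions API; the id and query values below are invented for illustration). Note that the arguments arrive as a JSON string, not a parsed object:

```python
import json

# Illustrative shape of a model-issued tool call (values are invented):
tool_call_payload = {
    "id": "call_abc123",
    "type": "function",
    "function": {
        "name": "exec_sql",
        "arguments": '{"sql_stmt": "SELECT COUNT(*) FROM orders"}',
    },
}

# The arguments field is a JSON string and must be parsed before use:
args = json.loads(tool_call_payload["function"]["arguments"])
print(args["sql_stmt"])  # SELECT COUNT(*) FROM orders
```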

Working with functions can be broken down into the following main steps:

  1. Define a function in your codebase for the model to call.
  2. Explain the function to the model.
  3. Provide the function definitions and messages to the model as available “tools.”
  4. Use the model’s response to call your API or function.
  5. Call the chat completions API again with the function’s result to get the final answer.

In this post, we will create an SQL Assistant using the `GPT-4o` model and define a custom function to execute SQL queries. The model should be able to understand the context and decide whether or not to execute the provided custom function. The combination of the defined function and the LLM should allow users to:

  • Generate SQL queries based on natural language questions.
  • Execute these queries on the database.
  • Retrieve and present the results in a user-friendly manner.

Approach

Step 1: Create a Python function to execute queries and explain it to the LLM

Define a Python function to connect to the MySQL database and execute the query against it. Explain this function to the LLM and instruct it to use the function as an available tool.

a. define function -

def exec_sql(sql_stmt: str):
    with sqlconn.cursor() as cur:
        cur.execute(sql_stmt)
        result = cur.fetchall()

        if len(result) > 0:
            columns = [desc[0] for desc in cur.description]
            result_dict = [dict(zip(columns, row)) for row in result]
            return result_dict

        return None
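The row-to-dict conversion inside `exec_sql` can be tried standalone. The sketch below substitutes Python's built-in sqlite3 for the MySQL connection (both follow the DB-API, though sqlite3 cursors are not context managers); the table and rows are invented purely for illustration:

```python
import sqlite3

# Stand-in for the MySQL connection; schema and data are illustrative only.
sqlconn = sqlite3.connect(":memory:")
sqlconn.execute("CREATE TABLE products (name TEXT, units_sold INTEGER)")
sqlconn.executemany(
    "INSERT INTO products VALUES (?, ?)",
    [("Laptop", 120), ("Monitor", 95), ("Mouse", 300)],
)

def exec_sql(sql_stmt: str):
    cur = sqlconn.cursor()  # sqlite3 cursors lack context-manager support
    cur.execute(sql_stmt)
    result = cur.fetchall()
    if len(result) > 0:
        # cur.description holds column metadata; index 0 is the column name
        columns = [desc[0] for desc in cur.description]
        return [dict(zip(columns, row)) for row in result]
    return None

rows = exec_sql("SELECT name, units_sold FROM products ORDER BY units_sold DESC")
print(rows[0])  # {'name': 'Mouse', 'units_sold': 300}
```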

b. let the LLM know about the function -

sql_tools = [
    {
        "type": "function",
        "function": {
            "name": "exec_sql",
            "description": "Execute sql query and return result dict object",
            "parameters": {
                "type": "object",
                "properties": {
                    "sql_stmt": {
                        "type": "string",
                        "description": "sql query",
                    },
                },
                "required": ["sql_stmt"],
            },
        },
    },
]

c. make the function available to the model

chat_completion = client.chat.completions.create(
    model=OPEN_AI_MODEL,
    messages=messages,
    temperature=0,
    tools=sql_tools,
    tool_choice="auto",
    top_p=0.95,
    frequency_penalty=0,
    presence_penalty=0,
    timeout=200,
)
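The `client` above is an Azure OpenAI chat client. A minimal sketch of its construction, assuming the openai Python SDK (v1.x) and that the endpoint, key, and deployment name come from environment variables you define yourself:

```python
import os
from openai import AzureOpenAI

# Endpoint, key, and deployment values are assumptions; substitute your own.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)
OPEN_AI_MODEL = os.environ.get("AZURE_OPENAI_DEPLOYMENT", "gpt-4o")
```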

Step 2: Create a prompt to instruct the LLM and ground it in the database metadata

This step grounds the LLM by including the database metadata in the prompt: information about the tables, columns, data types, and relationships within the database. Craft an effective prompt that instructs the LLM to adopt the persona of a SQL Assistant, constructing SQL queries from the provided metadata and the user input.

prompt = f"""
You are a SQL AI assistant that helps write complex SQL queries. Use the metadata information below to build SQL queries.
Add the schema name {MYSQL_DB} to all table names in the generated query. You have access to the tool 'exec_sql' to execute
queries and fetch results. Parse the results returned by the tool and convert them into a tabular format. Limit your
queries to the provided dataset.

sql engine: mysql

{sql_setup_prompt}

Output format -
sql statement:
result:
"""

Step 3: Ask Questions About the Data and use model response to invoke the function

Once the LLM has been given the tool, users can ask questions about the data. The LLM interprets each question, constructs the appropriate SQL query, and decides whether it needs to invoke a function. Based on the LLM's response, we execute the function to run the LLM-generated SQL query and feed the result back to the model.

if response_message.tool_calls:
    # The assistant message carrying the tool calls must be appended
    # to the conversation before the corresponding tool results.
    messages.append(response_message)

    for tool_call in response_message.tool_calls:
        if tool_call.function.name == "exec_sql":
            function_args = json.loads(tool_call.function.arguments)

            result = exec_sql(
                sql_stmt=function_args.get("sql_stmt")
            )

            messages.append({
                "tool_call_id": tool_call.id,
                "role": "tool",
                "name": "exec_sql",
                "content": str(result),
            })
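When assembled correctly, the message list sent on the follow-up call contains the user turn, the assistant turn carrying the tool calls, and the matching tool result; a `tool` message that does not follow its assistant turn is rejected by the API. A sketch with plain dicts (the id, query, and result values are illustrative):

```python
# Illustrative message sequence for the second chat completions call.
messages = [{"role": "user", "content": "Find top 3 products sold"}]

# Assistant turn that requested the tool call (values invented):
messages.append({
    "role": "assistant",
    "tool_calls": [{
        "id": "call_1",
        "type": "function",
        "function": {"name": "exec_sql",
                     "arguments": '{"sql_stmt": "SELECT ..."}'},
    }],
})

# Tool result, linked back via the matching tool_call_id:
messages.append({
    "tool_call_id": "call_1",
    "role": "tool",
    "name": "exec_sql",
    "content": "[{'name': 'Mouse', 'units_sold': 300}]",
})
```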

Step 4: Get the final response from the model

Append the function result to the model's input and call the chat completions API again for the final response. The model parses the function result in light of the user's question and produces the final answer: the actual data from the database, i.e., the result of the executed query.

You can find the complete source code in this notebook: SQL-AIX

Here is the model's response for one of the questions -

user_query = "Find top 3 products sold"
get_llm_response(prompt, user_query, sql_tools)

LLM response -

Conclusion

Incorporating tools (function calling) significantly enhances the capabilities of large language models (LLMs), letting them go beyond text generation to act on live systems such as databases.
