Building an Enterprise-Ready Chatbot with OpenAI Assistants (GPTs) — Part II

Integrating Advanced Features and Custom APIs

Mark Craddock
Prompt Engineering

--

Introduction

Following on from Part I: to build an enterprise-ready chatbot using OpenAI GPTs, it’s crucial to understand not only the fundamental architecture but also how to integrate advanced features and custom APIs for enhanced functionality. This article delves into these aspects, building on our previous discussion of the basic architecture pattern.

Look out for Part III — How to build the API using Replit and CloudFlare.

Utilising OpenAI GPTs for Dynamic Conversations

GPTs, known for their deep learning capabilities, enable chatbots to understand and respond in a human-like manner. Implementing GPTs involves setting up the chatbot to process user inputs using the GPT model, which then generates appropriate responses based on its training. The implementation can be customised to suit specific enterprise needs, like customer support, personal assistants, or data analysis.
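To make that flow concrete, here is a minimal sketch of a single chatbot turn using the chat-style message format. This is my own illustration, not code from the article: the model call is stubbed out with a canned reply so the shape of the turn can be read end to end without an API key.

```python
# Hedged sketch of one conversational turn. call_model is a stand-in
# for a real model invocation (e.g. an OpenAI chat completions call).
def call_model(messages):
    # Returns a canned reply so the example is self-contained.
    return "Happy to help with your Wardley Map."

def chat_turn(history, user_input, system_prompt="You are a Wardley Map analyst."):
    # Assemble the running context: system prompt, prior turns, new input.
    messages = [{"role": "system", "content": system_prompt}]
    messages += history
    messages.append({"role": "user", "content": user_input})
    reply = call_model(messages)
    # Record both sides of the exchange so later turns keep context.
    history.extend([
        {"role": "user", "content": user_input},
        {"role": "assistant", "content": reply},
    ])
    return reply

history = []
reply = chat_turn(history, "Can you analyse my map?")
```

In a real deployment the history list would be persisted per user session, which is exactly the conversational context the GPT model needs to respond coherently across turns.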

Prompt (Instructions)

Copy and paste the example instructions below for your GPT. Feel free to edit/update and improve this prompt.

As the Wardley Map Analyst, my primary role is to assist users in analyzing and visualizing Wardley Maps. When users upload a text file containing a Wardley Map, I will extract and directly pass the actual text of the map to the generateWardleyMap action using the 'text' parameter. This approach allows for a precise generation of the map's visual representation, essential for detailed analysis. I will interpret the components of the Wardley Map, providing strategic insights and discussing the purpose and impact of each element.

Custom API Integration

A notable aspect of building advanced chatbots is integrating custom APIs that extend the chatbot’s functionality beyond standard conversation. For instance, a chatbot can be empowered to generate Wardley Maps, an innovative concept in business strategy, through a specialised API.

Example API: Wardley Map Creation

Consider an API designed for creating a Wardley Map. Such an API would include endpoints for generating, saving, and retrieving Wardley Maps. Here’s a breakdown of the key components:

  • Generate Wardley Map (/v2/generate-map): This POST endpoint accepts a request containing the Online Wardley Map (OWM) in text format (text) and returns a URL of the generated map image.
  • Retrieve Generated Image (/v2/image/{filename}): A GET endpoint that retrieves the image file generated by the /generate-map endpoint.
  • Save a Wardley Map (/v2/maps/save): A POST endpoint that saves the provided Wardley Map data to create.wardleymaps.ai and returns a unique identifier and an edit URL for the map.
  • Get the Wardley Map (/v2/maps/fetch?id={id}): A GET endpoint that retrieves a saved Wardley Map from create.wardleymaps.ai using its unique identifier, provided in the request.
  • Privacy Statement (/v2/privacy): A GET endpoint that returns the privacy statement for the API, ensuring transparency and trust. All public OpenAI GPTs must have a privacy statement.
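To show how a client would exercise these endpoints, here is a hedged sketch of two request builders. The helper names and the example map text are my own invention; the base URL and paths are the ones listed above. Only the request shapes are constructed, and no HTTP call is made.

```python
import json

# Hypothetical helper names; paths match the endpoint list above.
BASE = "https://api.wardleymaps.ai"

def generate_map_request(owm_text):
    """Build the POST parts for /v2/generate-map: the OWM text goes in 'text'."""
    return ("POST", f"{BASE}/v2/generate-map", json.dumps({"text": owm_text}))

def fetch_map_request(map_id):
    """Build the GET URL for /v2/maps/fetch, keyed by the map's unique identifier."""
    return ("GET", f"{BASE}/v2/maps/fetch?id={map_id}", None)

method, url, body = generate_map_request("title Tea Shop")
```

The same pattern extends naturally to the save and image-retrieval endpoints.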

Each of these endpoints is described in the OpenAPI 3.1.0 schema, ensuring standardisation and ease of integration.

You can find the documentation for the API here. Note: The API only accepts requests from OpenAI.

The swagger file for the API is available here.

I’ve currently made this API open for experimenting in building Wardley Map chatbots. It is rate limited.

Architectural Considerations

When integrating such APIs with a GPT-powered chatbot, it’s essential to consider the architecture. The chatbot should be able to handle API calls efficiently, managing both the input/output data and the conversational context. This requires a robust backend architecture capable of asynchronous processing and handling potentially large volumes of data.
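The asynchronous-processing point can be sketched briefly. All names here are hypothetical, and the remote call is simulated with a short sleep, so the non-blocking shape is visible without any network access.

```python
import asyncio

# Minimal sketch: the backend awaits a map-generation call without
# blocking the rest of the conversation loop.
async def generate_map(owm_text: str) -> str:
    # A real backend would await a POST to /v2/generate-map here;
    # the sleep stands in for network latency.
    await asyncio.sleep(0.01)
    return "https://api.wardleymaps.ai/v2/image/example.png"

async def handle_turn(user_text: str) -> str:
    # Other conversational work could proceed concurrently while
    # the map renders.
    url = await generate_map(user_text)
    return f"Here is your map: {url}"

map_reply = asyncio.run(handle_turn("title Tea Shop"))
```

Under load, this pattern lets a single backend process keep many conversations responsive while slower API calls complete in the background.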

Security and Privacy

In an enterprise environment, security and privacy are paramount. The chatbot and integrated APIs must adhere to stringent data protection standards, encrypting data in transit and at rest, and complying with regulations like GDPR.

Scalability and Reliability

Finally, scalability and reliability are critical. The system should be designed to handle varying loads, with the ability to scale up during peak usage. Reliability is also crucial, as downtime can significantly impact business operations.

Actions

To access the custom API, a schema needs to be built and provided to the GPT assistant. This provides the details of your custom API to the GPT.

Schema Details

Here’s why the definition of the schema of an API endpoint is important:

  1. Clarity of Functionality: The description provides a clear understanding of what each API endpoint does. For instance, in the given example of a Wardley Map API, descriptions like “Generate a Wardley Map” immediately inform the GPT model about the purpose of each endpoint.
  2. Parameter Understanding: Detailed descriptions help the GPT model understand the parameters required by each endpoint. For example, knowing that the /v2/generate-map endpoint requires "text" (the Online Wardley Map in text format) as input is crucial for the GPT to prepare and send the correct data in its API requests.
  3. Response Handling: The description of expected responses enables the GPT model to anticipate and properly handle the data it receives. For example, if an endpoint returns a URL of an image, the GPT model knows to treat this as a link rather than as text or another data type.
  4. Error Management: Understanding the potential errors or default responses (as described in the API documentation) allows the GPT model to handle exceptions gracefully. This can include providing informative error messages to users or executing alternative actions when an API call fails. Error responses should be something the GPT can understand and act on.
  5. Contextual Relevance: Good endpoint descriptions enable the GPT model to use the API appropriately in various contexts. For instance, in a conversation about business strategy, the GPT might use the Wardley Map generation endpoint to provide visual strategy aids.
  6. Integration and Workflow: Detailed descriptions help in integrating the API seamlessly into the GPT’s workflow. The GPT model can understand when and how to make API calls as part of its larger task of conversational engagement or problem-solving.

Example Schema:

You can find the source here.

{
  "openapi": "3.1.0",
  "info": {
    "title": "Create a Wardley Map",
    "description": "API for creating an Online Wardley Map.",
    "version": "v1.2.7"
  },
  "servers": [
    {
      "url": "https://api.wardleymaps.ai"
    }
  ],
  "paths": {
    "/v2/generate-map": {
      "post": {
        "operationId": "generateWardleyMap",
        "x-openai-isConsequential": false,
        "summary": "Generate a Wardley Map",
        "description": "Returns a URL of an image of a Wardley Map, passed as string.",
        "requestBody": {
          "required": true,
          "content": {
            "application/json": {
              "schema": {
                "type": "object",
                "properties": {
                  "text": {
                    "type": "string",
                    "description": "The Online Wardley Map (OWM) in text format."
                  }
                },
                "required": [
                  "text"
                ]
              }
            }
          }
        },
        "responses": {
          "200": {
            "description": "Successful response",
            "content": {
              "application/json": {
                "schema": {
                  "$ref": "#/components/schemas/QueryResult"
                }
              }
            }
          }
        }
      }
    },
    "/v2/image/{filename}": {
      "get": {
        "operationId": "getImage",
        "summary": "Retrieve a generated image",
        "description": "Returns the image file generated by the /generate-map endpoint.",
        "parameters": [
          {
            "name": "filename",
            "in": "path",
            "required": true,
            "description": "The name of the image file to retrieve.",
            "schema": {
              "type": "string"
            }
          }
        ],
        "responses": {
          "200": {
            "description": "Image file",
            "content": {
              "image/png": {}
            }
          }
        }
      }
    },
    "/v2/privacy": {
      "get": {
        "operationId": "getPrivacyStatement",
        "summary": "Privacy Statement",
        "description": "Returns the privacy statement for the API.",
        "responses": {
          "200": {
            "description": "Privacy statement text",
            "content": {
              "text/plain": {}
            }
          }
        }
      }
    }
  },
  "components": {
    "schemas": {
      "QueryResult": {
        "type": "object",
        "properties": {
          "imageurl": {
            "type": "string"
          },
          "warnings": {
            "type": "array",
            "items": {
              "type": "string"
            }
          }
        }
      },
      "WardleyMapResponse": {
        "type": "object",
        "properties": {
          "id": {
            "type": "string",
            "description": "A unique identifier for the response."
          },
          "text": {
            "type": "string",
            "description": "The verified wardley map text."
          }
        },
        "required": ["id", "text"]
      }
    }
  }
}
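As a sanity check that a client request matches the schema above, here is a hedged sketch that builds, but deliberately does not send, the generateWardleyMap request. The map text is a made-up example; the endpoint URL and payload shape come from the schema.

```python
import json
import urllib.request

# Example OWM text (invented for illustration).
owm_text = "\n".join([
    "title Tea Shop",
    "component Cup of Tea [0.79, 0.61]",
    "component Cup [0.73, 0.78]",
    "Cup of Tea->Cup",
])

# Build the POST request described by the generateWardleyMap operation.
# urlopen() is intentionally not called, so this runs offline.
req = urllib.request.Request(
    "https://api.wardleymaps.ai/v2/generate-map",
    data=json.dumps({"text": owm_text}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# A 200 response would be a QueryResult JSON object containing an
# "imageurl" string and an optional "warnings" array.
```

Calling `urllib.request.urlopen(req)` against the live API would complete the round trip, subject to the rate limits mentioned earlier.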

Once you have pasted the schema into the OpenAI GPT, you should see the API endpoints listed and available for testing.

Testing

Testing these APIs is a fundamental step to ensure that the GPT performs as expected, remains reliable under various conditions, and provides a seamless user experience. Here’s an overview of the significance and methods of API testing:

Why API Testing is Essential

Functionality Verification: API testing ensures that the functionality provided by the API meets specified requirements. It verifies that the API performs expected operations, handles requests correctly, and returns the appropriate responses.

Reliability and Stability: Regular testing under different scenarios helps in identifying and fixing stability issues, ensuring the API can handle expected load and function consistently.

Performance Evaluation: Testing helps in evaluating the performance of APIs, including response times and throughput, which are crucial for applications that rely on real-time data processing and high-speed interactions.

Security Assurance: Security testing of APIs checks for vulnerabilities and ensures that data transmission is secure. This is vital in protecting sensitive information and maintaining user trust.

Security Testing: Includes testing for vulnerabilities, authentication and authorisation checks, and data encryption.

End-to-End Testing: Tests the entire workflow of the application with the API to ensure all integrated components function together seamlessly.
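One practical way to cover functionality verification without hitting the live, rate-limited API is to mock the HTTP layer. The wrapper and test names below are illustrative, not from the article:

```python
import json
from unittest import mock

# Thin wrapper around /v2/generate-map; the HTTP transport is passed in
# so a test can substitute a mock.
def generate_map(post_fn, owm_text):
    raw = post_fn("https://api.wardleymaps.ai/v2/generate-map",
                  json.dumps({"text": owm_text}))
    return json.loads(raw)["imageurl"]

# The mock stands in for whatever HTTP client the backend uses and
# returns a canned QueryResult-shaped body.
fake_post = mock.Mock(
    return_value='{"imageurl": "https://example.org/map.png", "warnings": []}'
)
image_url = generate_map(fake_post, "title Tea Shop")
```

In a real suite, the same pattern would also cover error responses, malformed map text, and timeout handling, feeding directly into the reliability and error-management goals above.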

Publishing

When you are happy with it, publish your GPT.

Conclusion

Building an enterprise-ready chatbot using OpenAI GPTs and integrating it with custom APIs like a Wardley Map generator presents a powerful combination. By paying close attention to implementation details, API integration, architecture, security, privacy, scalability, and reliability, businesses can develop chatbots that not only converse naturally but also offer advanced functionalities, driving innovation and efficiency in various enterprise applications.

--

Mark Craddock
Prompt Engineering

Techie. Built VH1, G-Cloud, Unified Patent Court, UN Global Platform. Saved UK Economy £12Bn. Now building AI stuff #datascout #promptengineer #MLOps #DataOps