Build your own ChatGPT Plugin

Dmytro Sazonov
𝐀𝐈 𝐦𝐨𝐧𝐤𝐬.𝐢𝐨
8 min read · Aug 21, 2023

This article answers the question of how to build your own ChatGPT plugin using Python. You don't need anything besides this guide and Visual Studio Code.


A few months ago, when I was researching ChatGPT plugin development, using custom plugins and building for the platform was not free. With the latest update, however, we can develop our own plugins entirely at no cost.

In the previous article, “Build your own conversational chatbot using Chat GPT API”, I promised to cover ChatGPT plugin development, and here it is. In this article we are going to experiment with customizing ChatGPT through its own extension concept: plugins.

Why develop a plugin?

ChatGPT has its limitations and restrictions. Out of the box, it cannot even browse the Internet to look up an answer. When you ask it something, it tries to answer from its own huge knowledge base, which only covers public data up to 2021. What about data produced after that year, or data that is not on the Internet at all but lives inside your own application?

Many developers have already deployed plugins for their own applications. Now it's time for you to do the same.

Prerequisites

For our experiment you will need a free ChatGPT subscription (https://chat.openai.com/) and the “Alpha” tab, as shown in the picture below.

tab “Alpha”, option “Plugins” should be enabled

As you can see, I don't have a paid subscription and I'm not even using GPT-4. You no longer need to pay to use custom plugins or to develop your own, which is good news for all developers.

Tools and libraries

Although OpenAI uses Python for plugin development in its official documentation, you are free to choose any language and toolset for your own plugin. Just remember that it is REST-based (OpenAPI, widely known as Swagger). I have also built an analogous C# example; the GitHub link is in the last section of this article. For this experiment, however, I will use:

  • Visual Studio Code
  • Python
  • Quart microframework

Quart is an asyncio reimplementation of the popular Flask microframework API. This means that if you understand Flask you understand Quart.

OpenAI suggests that developers use this framework as well.
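
If you have never used Quart, here is a tiny, hypothetical “hello world” (not part of the plugin) just to show how closely its API mirrors Flask; the route name and port are arbitrary:

import quart

app = quart.Quart(__name__)

@app.get("/hello")
async def hello():
    # Handlers are coroutines, but otherwise they look just like Flask views
    return {"message": "Hello from Quart!"}

if __name__ == "__main__":
    app.run(port=5009)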

Let’s start

When I was thinking about what to automate with a ChatGPT plugin, I had a lot of ideas, but in the end I picked the simplest one: a weather-forecasting plugin.

Weather forecasting Chat GPT Plugin schema

The picture above shows the general architecture of the service, and it is as simple as it gets. First, the user types an intent into the ChatGPT UI with the plugin enabled, something like “What is the weather in Boulder?”. The ChatGPT engine understands that the intent is about the weather and calls the most relevant plugin. The plugin calls the weather-forecasting service (https://wttr.in), parses the result, and returns the answer to ChatGPT in the plugin's response format. ChatGPT receives the answer and presents it to the user in its own way within the ChatGPT UI.
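
Before writing any plugin code, it can help to see what wttr.in actually returns. A small, optional probe script (plain requests, nothing plugin-specific; the “format” query parameter is an assumption about the service's options, not something the plugin requires) might look like this:

import requests

# wttr.in returns a plain-text forecast for the city in the URL path
city = "Boulder"
response = requests.get(f"https://wttr.in/{city}", params={"format": "3"}, timeout=10)
print(response.text)  # a short one-line summary if the format option is supported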

To make this happen, let's open Visual Studio Code.

Requirements

Our first file is ‘requirements.txt’; place the following lines inside it:

quart
quart-cors

Then install these libraries using the following command:

pip install -r requirements.txt

Logo

Our second file is ‘logo.png’, placed in the working directory next to ‘requirements.txt’. Ideally it should be transparent and high resolution; in our case I used 512x512px.
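
If you don't have a logo at hand, one quick way to generate a transparent 512x512 placeholder is with Pillow (an extra dependency, not listed in requirements.txt); this is just a convenience sketch:

from PIL import Image, ImageDraw

# Create a transparent 512x512 canvas and draw a simple sun-like circle on it
img = Image.new("RGBA", (512, 512), (0, 0, 0, 0))
draw = ImageDraw.Draw(img)
draw.ellipse((128, 128, 384, 384), fill=(255, 200, 0, 255))
img.save("logo.png")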

openapi.yaml

The file with the service-endpoint definition is ‘openapi.yaml’. This is an OpenAPI definition: the model in ChatGPT learns everything about our API from this YAML specification.

openapi: 3.0.1
info:
  title: Weather forecasting Plugin
  description: A plugin that allows the user to forecast the Weather conditions using ChatGPT.
  version: 'v1'
servers:
  - url: http://localhost:5008
paths:
  /weather-forecast/{city}:
    get:
      operationId: getWeather
      summary: Get the weather forecast
      parameters:
        - in: path
          name: city
          schema:
            type: string
          required: true
          description: The city for the weather forecasting.
      responses:
        "200":
          description: OK
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/getWeatherResponse'

components:
  schemas:
    getWeatherResponse:
      type: object
      properties:
        weather:
          type: string
          description: The weather forecast.

Here we describe the service endpoints that the ChatGPT model will use, plus some general information that is displayed in the ChatGPT UI while the plugin is being enabled.

As you can see, ‘servers: - url: http://localhost:5008’ declares that we are developing the service locally and will run it in debug mode.

The ‘info:’ section contains the title and description we will see while setting up the plugin. The ‘paths:’ section defines our service endpoints. In our case there is only one, ‘GET /weather-forecast/{city}’, with a single parameter ‘{city}’, and its response refers to the only schema in the file, ‘components:schemas:getWeatherResponse’. From this response definition we know that the service always answers with a single string containing the weather forecast.
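
A quick sanity check that the file is valid YAML and exposes the expected path can save a failed installation later. This optional snippet assumes PyYAML is installed (it is not in requirements.txt):

import yaml

with open("openapi.yaml") as f:
    spec = yaml.safe_load(f)

# Confirm the single endpoint and the response schema are where ChatGPT expects them
assert "/weather-forecast/{city}" in spec["paths"]
assert "getWeatherResponse" in spec["components"]["schemas"]
print("openapi.yaml looks structurally sound:", spec["info"]["title"])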

.well-known/ai-plugin.json

This is the plugin manifest, and it must be placed inside the ‘.well-known’ folder. ChatGPT looks for this file during the plugin installation process; it needs to be discoverable so the model can understand what the plugin can be used for.

{
  "schema_version": "v1",
  "name_for_human": "Weather forecast",
  "name_for_model": "weather_forecast",
  "description_for_human": "Plugin for forecasting the Weather conditions.",
  "description_for_model": "Plugin for forecasting the Weather conditions.",
  "auth": {
    "type": "none"
  },
  "api": {
    "type": "openapi",
    "url": "http://localhost:5008/openapi.yaml",
    "is_user_authenticated": false
  },
  "logo_url": "http://localhost:5008/logo.png",
  "contact_email": "misterd793@gmail.com",
  "legal_info_url": "https://medium.com/@dmytrosazonov"
}

As you may have noticed, some properties are duplicated for the model; ChatGPT uses them to discover the plugin and its purpose. The file also contains the links to the YAML definition and to logo.png. You can find more details about this file in the official documentation, though for this example you shouldn't need to.
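
In the same spirit, you can check that the manifest parses as JSON and contains the fields ChatGPT relies on; this is an optional sketch, not part of the plugin itself:

import json

with open(".well-known/ai-plugin.json") as f:
    manifest = json.load(f)

# A few of the fields ChatGPT uses to discover and describe the plugin
for key in ("schema_version", "name_for_model", "description_for_model", "api", "logo_url"):
    assert key in manifest, f"missing manifest field: {key}"
print("Manifest OK:", manifest["name_for_human"])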

main.py

Our main file contains all the logic of the plugin service. At the top we import ‘json’ and ‘requests’ as well as ‘quart’ and ‘quart_cors’. Then we enable CORS (Cross-Origin Resource Sharing) so that ChatGPT (https://chat.openai.com) is allowed to call our plugin.

import json
import requests

import quart
import quart_cors
from quart import request

# Allow cross-origin requests from the ChatGPT UI only
app = quart_cors.cors(quart.Quart(__name__),
                      allow_origin="https://chat.openai.com")

WEATHER_API_URL = 'https://wttr.in/{}'

@app.get("/weather-forecast/<string:city>")
async def get_weather(city):
    url = WEATHER_API_URL.format(city)
    try:
        # Call the weather service and pass its text response through
        data = requests.get(url)
        result = data.text
    except requests.RequestException:
        result = "The weather forecast for the city: {} is not available yet.".format(city)
    return quart.Response(response=json.dumps(result), status=200)

@app.get("/logo.png")
async def plugin_logo():
    filename = 'logo.png'
    return await quart.send_file(filename, mimetype='image/png')

@app.get("/.well-known/ai-plugin.json")
async def plugin_manifest():
    host = request.headers['Host']
    with open("./.well-known/ai-plugin.json") as f:
        text = f.read()
    return quart.Response(text, mimetype="text/json")

@app.get("/openapi.yaml")
async def openapi_spec():
    host = request.headers['Host']
    with open("openapi.yaml") as f:
        text = f.read()
    return quart.Response(text, mimetype="text/yaml")

def main():
    app.run(debug=True, host="0.0.0.0", port=5008)

if __name__ == "__main__":
    main()

The variable WEATHER_API_URL points to the web service we use for the weather forecast.

Next we define our one and only endpoint, ‘@app.get("/weather-forecast/<string:city>")’, with a single parameter in the HTTP GET request, <string:city>. This means ChatGPT has to identify a city before it can use our service. Inside the handler we simply call the weather-forecasting service, receive the response, extract the data, and return the forecast to the ChatGPT UI.
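
One optional refinement: the full wttr.in page is quite verbose, and shorter answers are easier for ChatGPT to work with. Assuming the service's compact output option works for you (the “format” parameter is an assumption about wttr.in, not a requirement of the plugin), the handler could request a one-line summary instead:

@app.get("/weather-forecast/<string:city>")
async def get_weather(city):
    url = WEATHER_API_URL.format(city)
    try:
        # Ask wttr.in for a one-line summary instead of the full ASCII-art page
        data = requests.get(url, params={"format": "3"}, timeout=10)
        result = data.text
    except requests.RequestException:
        result = "The weather forecast for the city: {} is not available yet.".format(city)
    return quart.Response(response=json.dumps(result), status=200)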

Then we make the remaining files discoverable by the application: logo.png, ai-plugin.json and openapi.yaml, as you can see in the corresponding routes.

Finally, we define the main function, which runs the application on port 5008, the same port declared in the OpenAPI spec and in the manifest.

Run

You are ready to run the service. In the Visual Studio Code terminal, use the following command:

python main.py

If everything is OK and you installed all the dependencies earlier, you will see something like the following:

Running Chat GPT Plugin for Weather forecasting
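
Before switching to the ChatGPT UI, you can smoke-test the running service locally; a small, optional script like the one below (run from a second terminal) checks the two URLs that matter most:

import requests

BASE = "http://localhost:5008"

# The manifest must be reachable for the installation step to succeed
print(requests.get(f"{BASE}/.well-known/ai-plugin.json").status_code)  # expect 200

# The endpoint itself should return a JSON-encoded forecast string
print(requests.get(f"{BASE}/weather-forecast/Boulder").text[:200])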

Now it's time to try it in the ChatGPT UI. Installation takes just a few steps:

  1. Open the “Alpha” tab and go to the Plugin store
  2. Choose “Develop your own plugin”
  3. Let ChatGPT find your manifest file
  4. Click the ‘Install localhost plugin’ button
  5. The Weather forecast plugin is enabled

Once it's enabled, you will see a blue checkmark in front of the plugin title in the ChatGPT UI, as shown in the picture above.

Usage

With the development and installation done, it's time to test the plugin in a real environment. Let's do that.

Weather forecast for Kyiv city, Ukraine

As you can see in the picture above, we simply asked ChatGPT for the weather in Kyiv, and ChatGPT answered using our plugin service.

Take some time to explore how it works and experiment with it on your end. Feel free to download the code using the links in the last section of this article.

What is next?

While experimenting with the ChatGPT API and plugins, I realized how useful it would be if ChatGPT could interact directly with the private data inside my application, for example by querying my own database. Guess what? I've found a way to make it happen, and I'll share it in the next article. Stay tuned!

If you’re eager to delve deeper

Visit GitHub to download the code and start experimenting on your end. Don't hesitate to reach out to me on Twitter if you have any questions.

Python example: https://github.com/under0tech/plugin-weather

C# example: https://github.com/under0tech/WeatherPlugin

Twitter: https://twitter.com/dmytro_sazonov
