Make Your Personal GPT-3.5 Turbo Application!

A hands-on, step-by-step guide to building your own GPT-3.5 application with Python.

Aarafat Islam
11 min read · Apr 19, 2023

“Large language models are transforming the way we process and understand language.” — Bryan McCann

ChatGPT is a large-scale language model developed by OpenAI, built on the GPT-3.5 series of the GPT architecture. It generates human-like text in response to a given prompt or question. The model was trained on a massive dataset, reportedly over 45 terabytes of text from a variety of sources, including books, articles, and websites, and the GPT-3 family it descends from has roughly 175 billion parameters, making it one of the largest language models available to date. Because its training data is diverse, covering a wide range of topics and genres, it can produce coherent and informative responses to a wide variety of prompts. ChatGPT is a powerful tool for natural language processing tasks: it can answer questions, complete sentences, and generate entire paragraphs of text, as well as translate between languages, summarize long documents, and write in a specific style or tone.

The “Personal GPT-3.5 Turbo Application” project uses OpenAI’s GPT-3.5 Turbo model to let users build their own custom virtual assistants and chatbots with strong natural language processing capabilities. By leveraging deep learning under the hood, the application generates natural language responses to user input, making it a flexible tool for a wide range of applications.

In this article, I will provide a detailed explanation of the “Personal GPT-3.5 Turbo Application” code and how it works. I will cover each section of the code, including the OpenAI class, the search() function, and the GUI setup. By the end of this article, you will have a thorough understanding of the power of ChatGPT as well as the ability to create your own personal GPT application.

Necessary Libraries:

import openai, pyperclip, os
from rich import print
import threading
import tkinter as tk
from tkinter import ttk

Why these libraries are imported:

  • openai: The official OpenAI Python library provides an interface for working with OpenAI’s API and models. It gives access to natural language processing (NLP) models such as the GPT family (including the gpt-3.5-turbo model used here), along with other API services such as embeddings and image generation.
  • pyperclip: This library is used for copying and pasting text to and from the clipboard. It provides a simple way to access the clipboard from within a Python program and allows you to manipulate the text stored there.
  • os: This library provides a way to interact with the operating system. It allows you to perform a variety of system-level tasks, such as accessing files and directories, creating and deleting processes, and managing environment variables.
  • rich: This library provides advanced formatting and styling options for console output. It allows you to add color, bold and italic text, and other styles to your console output, making it more readable and attractive. (A short demo of rich and pyperclip follows this list.)
  • threading: This library is used for creating and managing threads in a program. It allows you to run multiple tasks concurrently and can be used to improve the performance of your program or to perform long-running tasks without freezing the user interface.
  • tkinter: This library provides a simple and easy-to-use interface for creating graphical user interfaces (GUIs) in Python. It is built on top of the popular Tk GUI toolkit and allows you to create windows, buttons, input fields, and other interactive elements in your program.
  • ttk: This module provides additional widgets and styles for the tkinter library. It includes more advanced widgets, such as progress bars, tree views, and notebook tabs, and provides a more consistent look and feel across different operating systems.
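
If rich and pyperclip are new to you, here is a minimal, standalone demo of what they do. It assumes both packages are installed (e.g. with pip install openai pyperclip rich); tkinter ships with most Python installations:

# Quick demo of the two less common libraries used in this project.
from rich import print
import pyperclip

# rich understands simple markup tags for colored and styled console output
print("[bold green]Response generated successfully[/bold green]")

# pyperclip copies text to the system clipboard and reads it back
pyperclip.copy("Hello from the clipboard")
print(pyperclip.paste())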

Set API Key:


API_KEY = 'paste your api key'

Explanation of this code:

  • API_KEY: This variable stores the API key needed to access the OpenAI API. An API key is a unique identifier that allows a user to access a specific set of resources or services on a web-based platform.
  • https://platform.openai.com/account/api-keys: This URL leads to the page on the OpenAI platform where you can generate an API key. Sign in to your account, generate a key, and paste it into the code in place of the placeholder text; this authenticates your requests to the OpenAI API. A safer alternative to hard-coding the key is sketched below.
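
If you prefer not to keep the key in the source file, here is a minimal sketch that reads it from an environment variable using the os module already imported above. The variable name OPENAI_API_KEY is just a convention I am assuming here, not something the rest of the code requires:

import os

# Read the key from the environment instead of hard-coding it.
# OPENAI_API_KEY is a conventional name; use whatever you export in your shell.
API_KEY = os.environ.get('OPENAI_API_KEY', 'paste your api key')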

Create OpenAI Class:

class OpenAI:
    def __init__(self, api_key) -> None:
        openai.api_key = api_key
        self.response = ''
        self.prompt = ''

    def generate_response(self, prompt: str) -> str:
        self.prompt = prompt
        completion = openai.ChatCompletion.create(model='gpt-3.5-turbo', messages=[{"role": "user", "content": prompt}])
        self.response = completion.choices[0].message.content
        return self.response

    def handle_exception(self, error: Exception) -> None:
        if isinstance(error, openai.error.OpenAIError):
            print(f'[bold red]Response had an error: {error}[/bold red]')
        elif isinstance(error, IndexError):
            print(f'[bold red]Response had an error: no response generated[/bold red]')
        elif isinstance(error, Exception):
            print(f'[bold red]Error: {error}[/bold red]')
        else:
            print(error)

Explanation of this code:

The OpenAI class initializes an instance of the OpenAI API client with the user's API key. It provides two methods: generate_response() for sending a user prompt to the API and receiving a response, and handle_exception() for reporting errors that may occur during the API call. A minimal usage sketch of the class follows the list below.

  • class OpenAI: Defines a new class called OpenAI. This class is used to create an interface for working with the OpenAI API in a more organized and modular way.
  • def __init__(self, api_key) -> None: Defines a constructor method for the OpenAI class that takes an API key as input and sets up instance variables for the response and prompt.
  • openai.api_key = api_key: Sets the OpenAI API key for the current instance of the class.
  • def generate_response(self, prompt: str) -> str: Defines a method for generating a response from the OpenAI API, taking a prompt string as input and returning a response string.
  • completion = openai.ChatCompletion.create(model='gpt-3.5-turbo', messages=[{"role": "user", "content": prompt}]): Sends the prompt to the OpenAI API and stores the response in a variable called completion.
  • self.response = completion.choices[0].message.content: Extracts the response from the completion object and stores it in the instance variable self.response.
  • def handle_exception(self, error: Exception) -> None:: Defines a method for handling exceptions that may occur during the OpenAI API call.
  • print(f'[bold red]Response had an error: {error}[/bold red]'): Prints an error message with rich text formatting to the console, depending on the type of error that occurred.
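
To try the class on its own, here is a minimal usage sketch. It assumes API_KEY holds a valid key and that the pre-1.0 openai package (the one exposing openai.ChatCompletion) is installed:

# Minimal command-line use of the OpenAI class defined above.
api = OpenAI(API_KEY)
try:
    answer = api.generate_response("Explain recursion in one sentence.")
    print(answer)
except Exception as error:
    api.handle_exception(error)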

Create Search Function:

def search(api, entry, text, prompt_label):
    prompt = entry.get()

    # Create the progress bar widget in the same frame as the search bar
    progress = ttk.Progressbar(top_frame, mode='indeterminate', length=170)
    progress.place(relx=0.5, rely=1.4, anchor=tk.CENTER)
    progress.start()

    def generate_response_thread():
        try:
            response = api.generate_response(prompt)
            root.after(0, update_response, response)
        except Exception as error:
            root.after(0, api.handle_exception, error)

    def update_response(response):
        text.delete(1.0, tk.END)
        text.insert(tk.END, response)
        pyperclip.copy(response)
        entry.delete(0, tk.END)  # Clear the search bar after a successful search
        prompt_label.configure(text=f"Prompt: {api.prompt}")
        progress.stop()
        progress.place_forget()

    threading.Thread(target=generate_response_thread).start()

Explanation of this code:

The search() function is called when the user clicks the "Search" button in the GUI. It retrieves the user's input from the search bar, sends it to the OpenAI API on a background thread to generate a response, and then displays that response in the response text area. It also updates the prompt label with the user's original input and hands any exception that occurs during the API call to handle_exception(). In addition, the function creates and displays a progress bar to show that the program is working on a response. A stripped-down sketch of the thread-plus-root.after() pattern it relies on follows the list below.

  • def search(api, entry, text, prompt_label): Defines a function called search that takes four parameters: api, entry, text, and prompt_label.
  • prompt = entry.get(): Retrieves the text entered into the entry widget (likely a text input field) and stores it in a variable called prompt.
  • progress = ttk.Progressbar(top_frame, mode='indeterminate', length=170): Creates a new progress bar widget using the ttk module and sets its mode to 'indeterminate'. The progress bar will be used to indicate that the program is working on generating a response to the user's input.
  • progress.place(relx=0.5, rely=1.4, anchor=tk.CENTER): Places the progress bar in the same frame as the search bar.
  • progress.start(): Starts the progress bar animation.
  • def generate_response_thread(): Defines a new function called generate_response_thread that will run in a separate thread.
  • response = api.generate_response(prompt): Sends the user's prompt to the OpenAI API using the generate_response method of the api object, and stores the resulting response in a variable called response.
  • root.after(0, update_response, response): Schedules the update_response function to be run as soon as possible, passing the response as an argument to the function.
  • except Exception as error: root.after(0, api.handle_exception, error): If an exception occurs during the API call, the handle_exception method of the api object is called to handle the exception and print an error message to the console.
  • def update_response(response): Defines a new function called update_response that will be called when the response has been generated.
  • text.delete(1.0, tk.END): Deletes any existing text in the text widget (output field).
  • text.insert(tk.END, response): Inserts the generated response into the text widget.
  • pyperclip.copy(response): Copies the response to the clipboard.
  • entry.delete(0, tk.END): Clears the entry widget after a successful search.
  • prompt_label.configure(text=f"Prompt: {api.prompt}"): Sets the text of the prompt_label widget to display the original prompt.
  • progress.stop(): Stops the progress bar animation.
  • progress.place_forget(): Removes the progress bar widget from the GUI.
  • threading.Thread(target=generate_response_thread).start(): Starts a new thread to run the generate_response_thread function, allowing the program to continue running while the response is generated.
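
The key idea in search() is that Tkinter widgets should only be touched from the main thread, so the slow API call runs on a worker thread and passes its result back through root.after(). Here is a stripped-down sketch of just that pattern; slow_task and the label are stand-ins I am using for illustration, not part of the application code:

import threading
import time
import tkinter as tk

root = tk.Tk()
label = tk.Label(root, text="waiting...")
label.pack(padx=20, pady=20)

def slow_task():
    # Stand-in for the API call: runs on a worker thread
    time.sleep(2)
    result = "done"
    # Schedule the widget update back on the main (GUI) thread
    root.after(0, lambda: label.configure(text=result))

threading.Thread(target=slow_task).start()
root.mainloop()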

Create GUI:

if __name__ == "__main__":
    api = OpenAI(API_KEY)

    root = tk.Tk()
    root.title("Personal GPT-3.5 Turbo Application")
    root.resizable(False, False)

    top_frame = ttk.Frame(root, padding=10)
    top_frame.grid(row=0, column=0, sticky=tk.W)
    middle_frame = ttk.Frame(root, padding=10)
    middle_frame.grid(row=1, column=0, sticky=tk.W)
    bottom_frame = ttk.Frame(root, padding=10)
    bottom_frame.grid(row=2, column=0, sticky=tk.W)

    entry_label = ttk.Label(top_frame, text="Search:")
    entry = ttk.Entry(top_frame, width=40)
    search_button = ttk.Button(top_frame, text="Search", command=lambda: search(api, entry, response_text, prompt_label))
    prompt_label = ttk.Label(middle_frame, text="")
    response_label = ttk.Label(bottom_frame, text="Response:")
    response_text = tk.Text(bottom_frame, wrap=tk.WORD, width=50, height=15, bg='light blue')

    entry_label.grid(row=0, column=0, sticky=tk.W)
    entry.grid(row=0, column=1, sticky=tk.W)
    search_button.grid(row=0, column=2, sticky=tk.W)
    prompt_label.grid(row=0, column=0, sticky=tk.W)
    response_label.grid(row=0, column=0, sticky=tk.W)
    response_text.grid(row=1, column=0, columnspan=3, sticky=tk.W)

    root.mainloop()

Explanation of this code:

This code sets up the graphical user interface (GUI) for the application. It creates a Tkinter window and several frames to organize the widgets, plus a search bar, a search button, and a text area that displays the response generated by the OpenAI API. When the user enters a prompt and clicks the search button, the search() function sends the prompt to the OpenAI API and displays the resulting response in the text area. The mainloop() method starts the event loop that displays the window and waits for user input. A reduced layout skeleton that can be run without an API key appears after the list below.

  • if __name__ == "__main__": Runs the following code only if this script is being run directly and not imported as a module.
  • api = OpenAI(API_KEY): Initializes an instance of the OpenAI class using the API_KEY specified earlier.
  • root = tk.Tk(): Creates a new instance of a Tkinter window.
  • root.title("Personal GPT-3.5 Turbo Application"): Sets the title of the window.
  • root.resizable(False, False): Disables the ability to resize the window.
  • top_frame = ttk.Frame(root, padding=10): Creates a new frame that will be positioned at the top of the window.
  • middle_frame = ttk.Frame(root, padding=10): Creates a new frame that will be positioned in the middle of the window.
  • bottom_frame = ttk.Frame(root, padding=10): Creates a new frame that will be positioned at the bottom of the window.
  • entry_label = ttk.Label(top_frame, text="Search:"): Creates a new label for the search field.
  • entry = ttk.Entry(top_frame, width=40): Creates a new entry widget for the user to input their search prompt.
  • search_button = ttk.Button(top_frame, text="Search", command=lambda: search(api, entry, response_text, prompt_label)): Creates a new button that calls the search function when clicked.
  • prompt_label = ttk.Label(middle_frame, text=""): Creates a new label to display the original search prompt.
  • response_label = ttk.Label(bottom_frame, text="Response:"): Creates a new label for the response output field.
  • response_text = tk.Text(bottom_frame, wrap=tk.WORD, width=50, height=15,bg='light blue'): Creates a new text widget to display the response generated by the OpenAI API.
  • root.mainloop(): Starts the event loop for the window to display the widgets and listen for user input.
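
If you want to experiment with the layout before wiring up the API, here is a reduced skeleton of the same frame-and-grid arrangement. The echo() handler is a stand-in for search() that I am using purely for illustration:

import tkinter as tk
from tkinter import ttk

root = tk.Tk()
root.title("Layout skeleton")

top_frame = ttk.Frame(root, padding=10)
top_frame.grid(row=0, column=0, sticky=tk.W)
bottom_frame = ttk.Frame(root, padding=10)
bottom_frame.grid(row=1, column=0, sticky=tk.W)

entry = ttk.Entry(top_frame, width=40)
entry.grid(row=0, column=0, sticky=tk.W)

def echo():
    # Stand-in for search(): echoes the entry text into the text box
    text.delete(1.0, tk.END)
    text.insert(tk.END, entry.get())

ttk.Button(top_frame, text="Go", command=echo).grid(row=0, column=1, sticky=tk.W)

text = tk.Text(bottom_frame, wrap=tk.WORD, width=50, height=10)
text.grid(row=0, column=0, sticky=tk.W)

root.mainloop()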

Output:

Screenshots: the prompt being processed with the progress bar running, followed by the generated response displayed in the GUI.

Features of the program:

  • The program uses OpenAI’s ChatGPT model for generating responses to user prompts.
  • It has a GUI with a search bar and a response section.
  • The program automatically clears the search bar after a successful search.
  • The program uses a progress bar to indicate that a response is being generated.
  • The program displays the prompt that was searched in the prompt section of the GUI.
  • The response section of the GUI displays the generated response from the ChatGPT model.
  • The program copies the generated response to the clipboard automatically after displaying it in the response section of the GUI.
  • The program is designed to handle errors that may occur during response generation and display appropriate error messages to the user.

Complete Code:

Github link: https://github.com/aarafat27/Personal-GPT-3.5-Application

# import all the necessary libraries
import openai, pyperclip, os
from rich import print
import threading
import tkinter as tk
from tkinter import ttk

# you can get your API_KEY from this link: https://platform.openai.com/account/api-keys
API_KEY = 'paste your api key'

class OpenAI:
    def __init__(self, api_key) -> None:
        openai.api_key = api_key
        self.response = ''
        self.prompt = ''

    def generate_response(self, prompt: str) -> str:
        self.prompt = prompt
        completion = openai.ChatCompletion.create(model='gpt-3.5-turbo', messages=[{"role": "user", "content": prompt}])
        self.response = completion.choices[0].message.content
        return self.response

    def handle_exception(self, error: Exception) -> None:
        if isinstance(error, openai.error.OpenAIError):
            print(f'[bold red]Response had an error: {error}[/bold red]')
        elif isinstance(error, IndexError):
            print(f'[bold red]Response had an error: no response generated[/bold red]')
        elif isinstance(error, Exception):
            print(f'[bold red]Error: {error}[/bold red]')
        else:
            print(error)

def search(api, entry, text, prompt_label):
    prompt = entry.get()

    # Create the progress bar widget in the same frame as the search bar
    progress = ttk.Progressbar(top_frame, mode='indeterminate', length=170)
    progress.place(relx=0.5, rely=1.4, anchor=tk.CENTER)
    progress.start()

    def generate_response_thread():
        try:
            response = api.generate_response(prompt)
            root.after(0, update_response, response)
        except Exception as error:
            root.after(0, api.handle_exception, error)

    def update_response(response):
        text.delete(1.0, tk.END)
        text.insert(tk.END, response)
        pyperclip.copy(response)
        entry.delete(0, tk.END)  # Clear the search bar after a successful search
        prompt_label.configure(text=f"Prompt: {api.prompt}")
        progress.stop()
        progress.place_forget()

    threading.Thread(target=generate_response_thread).start()

if __name__ == "__main__":
    api = OpenAI(API_KEY)

    root = tk.Tk()
    root.title("Personal GPT-3.5 Turbo Application")
    root.resizable(False, False)

    top_frame = ttk.Frame(root, padding=10)
    top_frame.grid(row=0, column=0, sticky=tk.W)
    middle_frame = ttk.Frame(root, padding=10)
    middle_frame.grid(row=1, column=0, sticky=tk.W)
    bottom_frame = ttk.Frame(root, padding=10)
    bottom_frame.grid(row=2, column=0, sticky=tk.W)

    entry_label = ttk.Label(top_frame, text="Search:")
    entry = ttk.Entry(top_frame, width=40)
    search_button = ttk.Button(top_frame, text="Search", command=lambda: search(api, entry, response_text, prompt_label))
    prompt_label = ttk.Label(middle_frame, text="")
    response_label = ttk.Label(bottom_frame, text="Response:")
    response_text = tk.Text(bottom_frame, wrap=tk.WORD, width=50, height=15, bg='light blue')

    entry_label.grid(row=0, column=0, sticky=tk.W)
    entry.grid(row=0, column=1, sticky=tk.W)
    search_button.grid(row=0, column=2, sticky=tk.W)
    prompt_label.grid(row=0, column=0, sticky=tk.W)
    response_label.grid(row=0, column=0, sticky=tk.W)
    response_text.grid(row=1, column=0, columnspan=3, sticky=tk.W)

    root.mainloop()

In conclusion, the “Personal GPT-3.5 Turbo Application” represents a powerful example of how OpenAI and ChatGPT can be harnessed to create custom virtual assistants and chatbots with impressive natural language processing capabilities. By leveraging the power of deep learning and machine learning algorithms, this application enables users to generate natural language responses to user input, making it an incredibly powerful tool for a wide range of applications. With the ability to customize the application to fit specific use cases, the potential for this technology is virtually limitless. As language continues to play an increasingly important role in our digital lives, the impact of ChatGPT and OpenAI is likely to continue to grow, paving the way for a future where humans and machines can communicate with one another more seamlessly than ever before.
