How to Restrict GPT-3.5 Turbo Search Base

Syed Khizar Rayaz
2 min read · Feb 1, 2024


Assumptions:

While writing this, I presume you already have your OpenAI API key and Assistant ID. If you want to know how to get them, comment down below.
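
For reference, a minimal configuration sketch (assuming the key is stored in the OPENAI_API_KEY environment variable and using a placeholder Assistant ID) looks like this:
import os
import openai

# Read the API key from the environment instead of hard-coding it.
openai.api_key = os.environ["OPENAI_API_KEY"]

# Placeholder: use the Assistant ID from your own OpenAI dashboard.
assistant_id = "asst_..."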

Basics to Know:

Before we start, I want you to know the basics of the OpenAI API as of the January 2024 update:

There are three roles used when calling the API (a minimal example follows this list):

  1. System: directs the model toward the kind of response you want.
  2. User: the query you send to the model.
  3. Assistant: the response you get back from the model.
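
To make the roles concrete, here is a minimal, stand-alone sketch (the model name and messages are placeholders, not part of the steps below) that sends a system and a user message to the plain Chat Completions endpoint and reads back the assistant reply:
import openai

response = openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "Answer only from the provided context."},
        {"role": "user", "content": "What does the document say about pricing?"},
    ],
)
print(response.choices[0].message.content)  # the assistant's reply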

Steps to Follow:

  • Initialize a thread if you are working with a continuous, back-and-forth interaction with the model (a setup sketch, including the imports used later, follows the snippet):
thread = openai.beta.threads.create()
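
The snippets below read the thread ID back from Streamlit's session state, so a minimal setup sketch (assuming a Streamlit app and the imports used later) could look like this:
import time
import openai
import streamlit as st

# Create the thread once per session and remember its ID, since the
# later calls read it from st.session_state.thread_id.
if "thread_id" not in st.session_state:
    thread = openai.beta.threads.create()
    st.session_state.thread_id = thread.id
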
  • Then call the messages.create method, passing the thread ID you just created, the role, and the content, which is the user's prompt (a sketch of collecting the prompt follows the snippet):
openai.beta.threads.messages.create(
    thread_id=st.session_state.thread_id,
    role="user",
    content=prompt
)
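
Where prompt comes from is up to you; in a Streamlit app it would typically be a chat input box, for example (an assumed UI choice, not something the steps require):
# st.chat_input returns None until the user submits something, so the
# remaining calls should only run once a prompt actually exists.
prompt = st.chat_input("Ask a question about the document")
if prompt is None:
    st.stop()
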
  • Call the messages.create method again to pass in the context, i.e. the search base, of the model; in this case some extracted PDF text, pdf_text (a sketch of how it might be extracted follows the snippet). Make sure to call all of the following methods only once you have already passed/uploaded your search base:
openai.beta.threads.messages.create(
    thread_id=st.session_state.thread_id,
    role="user",
    content=pdf_text
)
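
How you obtain pdf_text is also up to you; one common approach (an assumption here, using the pypdf package) is to extract the text page by page:
from pypdf import PdfReader

# Build the search base by concatenating the text of every page.
reader = PdfReader("knowledge_base.pdf")  # hypothetical file name
pdf_text = "\n".join(page.extract_text() or "" for page in reader.pages)
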
  • After that, call the runs.create method, passing the thread ID, the assistant ID, and any optional instructions you want to give the model, to start generating the response. Then collect it with the runs.retrieve method, passing the thread ID and the run ID you just received, polling until the run completes:
run = openai.beta.threads.runs.create(
    thread_id=st.session_state.thread_id,
    assistant_id=assistant_id,
    instructions="Please answer the queries using the knowledge provided in the files. When adding other information mark it clearly as such.",
)

while run.status != 'completed':
    time.sleep(1)
    run = openai.beta.threads.runs.retrieve(
        thread_id=st.session_state.thread_id,
        run_id=run.id
    )
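
Note that a run can also end in states such as 'failed', 'cancelled', or 'expired', in which case the loop above would spin forever. A slightly more defensive sketch (the same polling idea, just with an explicit terminal-state check) is:
terminal_states = {"completed", "failed", "cancelled", "expired"}
while run.status not in terminal_states:
    time.sleep(1)
    run = openai.beta.threads.runs.retrieve(
        thread_id=st.session_state.thread_id,
        run_id=run.id
    )
if run.status != "completed":
    st.error(f"Run ended with status: {run.status}")  # surface the failure in the UI
    st.stop()
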
  • Finally, get the responses from the model using the messages.list method, passing the thread ID, then loop over the results to pull out the assistant response for each query you sent to the model:
messages = openai.beta.threads.messages.list(
    thread_id=st.session_state.thread_id
)

assistant_messages_for_run = [
    message for message in messages
    if message.run_id == run.id and message.role == "assistant"
]
for message in assistant_messages_for_run:
    if message.content and hasattr(message.content[0].text, 'value'):
        response_text = message.content[0].text.value
  • In the end, you can also return 'nothing was found for the query in the provided context or search base' when the condition above does not hold, i.e. in the else branch; a sketch of this fallback follows.
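
Putting the last two bullets together, a minimal sketch of this fallback (using the same variable names as above, with st.write as an assumed way to display the answer) could be:
response_text = None
for message in assistant_messages_for_run:
    if message.content and hasattr(message.content[0].text, 'value'):
        response_text = message.content[0].text.value

if response_text is None:
    response_text = "Nothing was found for the query in the provided context or search base."

st.write(response_text)  # display the answer (or the fallback) in the app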
