Create Custom-Tailored Search with ChatGPT

Angela Kunanbaeva
3 min read · Apr 5, 2023


Want to customize search with your company-specific information and use ChatGPT’s natural language understanding capabilities? Here’s a quick way to do so. To be clear, we’re going to use Azure OpenAI Service, which provides REST API access to OpenAI’s powerful models, including the GPT-3 and Embeddings model series. Not only is this a great and easy way to dip your toes into experimenting with OpenAI models, but when your application is production ready, you already have access to Azure’s global infrastructure, which meets critical security, compliance, and regional availability requirements.

So first and foremost, you will need to request access to Azure OpenAI Service here, wait till you get approved, and remember your endpoint and API key (jk, you don’t actually have to remember them; you can view them anytime).

Now that that’s done, let’s consider why we might want a customized search function. There are countless use cases, such as searching for specific tax or legal information, hotel databases, or acting as an assistant for an employee. The possibilities are endless! In this case, we’ll build a search function for a travel company. Why? Because after going through several OpenAI guides I ended up with some travel-information PDFs in my storage account, so I might as well put them to use.

One more thing: for the search functionality we will use Azure Cognitive Search, as it provides a good way to parse various document types (PDFs, DOCs, Excel, etc.) and supports semantic search.

Another way to search over your data is to use the Embeddings model; that method is covered in this post.

Configure Azure Search

To configure Azure Cognitive Search and create a search index, follow this quickstart. To put it simply: upload your files to Blob Storage, connect Azure Cognitive Search, and create an index and a semantic configuration. That’s all; this builds the knowledge base you can search over to provide customized responses. You’ll need your search endpoint and access key, so you might as well jot all of those endpoints and keys down in your .env while you’re at it.
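For reference, a minimal .env could look something like the sketch below. The variable names are just my own convention, not anything the SDKs require, so rename them to match whatever your code reads:

```bash
# .env — hypothetical variable names, adjust to your own code
AZURE_SEARCH_ENDPOINT=https://<your-search-service>.search.windows.net
AZURE_SEARCH_KEY=<your-search-key>
AZURE_SEARCH_INDEX=<your-index-name>
AZURE_OPENAI_ENDPOINT=https://<your-openai-resource>.openai.azure.com/
AZURE_OPENAI_KEY=<your-azure-openai-key>
```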

Deploy OpenAI models

If you want to follow the Embeddings search post, deploy an embedding model, preferably “text-embedding-ada-002”, to take advantage of the more recent weights/updates. But we most certainly need to deploy a good language model: if you have access to “gpt-35-turbo” or “text-davinci-003”, deploy either of those. The difference is said to be significant, as “gpt-35-turbo” is designed for conversational interfaces, while “text-davinci-003” is a general text-completion model. However, “text-davinci-003” works perfectly fine for our search use case, as we are not going to chat with it. When you get your hands on “gpt-35-turbo”, you can switch to that one.
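One thing worth noting: with Azure OpenAI you call the model by the deployment name you chose in the portal, not by the model name. A rough sketch with the openai Python SDK (v0.x), where the deployment name “travel-davinci” and the API version are my own assumptions:

```python
import os

import openai

# Point the openai SDK (v0.x) at your Azure OpenAI resource
openai.api_type = "azure"
openai.api_base = os.getenv("AZURE_OPENAI_ENDPOINT")  # e.g. https://<resource>.openai.azure.com/
openai.api_version = "2022-12-01"                     # completions API version at the time of writing
openai.api_key = os.getenv("AZURE_OPENAI_KEY")

# Quick smoke test: "travel-davinci" is whatever deployment name you picked in the portal
response = openai.Completion.create(
    engine="travel-davinci",
    prompt="Say hello to a traveler in one sentence.",
    max_tokens=30,
)
print(response["choices"][0]["text"].strip())
```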

Write search.py

First, we want to load all the necessary libraries and initiate the Azure Search client.
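Something along these lines, assuming the azure-search-documents and python-dotenv packages and the environment variable names from the .env sketch above:

```python
import os

import openai
import streamlit as st
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from dotenv import load_dotenv

# Pull endpoints and keys from .env
load_dotenv()

# Azure OpenAI (openai SDK v0.x), same configuration as in the previous snippet
openai.api_type = "azure"
openai.api_base = os.getenv("AZURE_OPENAI_ENDPOINT")
openai.api_version = "2022-12-01"
openai.api_key = os.getenv("AZURE_OPENAI_KEY")

# Azure Cognitive Search client over the index we built earlier
search_client = SearchClient(
    endpoint=os.getenv("AZURE_SEARCH_ENDPOINT"),
    index_name=os.getenv("AZURE_SEARCH_INDEX"),
    credential=AzureKeyCredential(os.getenv("AZURE_SEARCH_KEY")),
)
```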

This way, when a user enters a search prompt, we can search over our knowledge base and tailor the response. Because the documents we search over may contain plenty of information, we want to set up the context for our ChatGPT response so that it succinctly summarizes the result and presents it as bullet points.
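Continuing from the setup above, a minimal sketch of that flow might look like this. The field name “content”, the semantic configuration name “travel-semantic”, and the deployment name “travel-davinci” are placeholders; swap in whatever you used in your index and deployment:

```python
def answer_query(query: str) -> str:
    # Pull the top matching chunks from the index; "content" is whichever
    # field holds your document text, "travel-semantic" your semantic config
    results = search_client.search(
        search_text=query,
        query_type="semantic",
        semantic_configuration_name="travel-semantic",
        query_language="en-us",
        top=3,
    )
    context = "\n\n".join(doc["content"] for doc in results)

    # Ask the model to answer only from the retrieved context, as bullet points
    prompt = (
        "Answer the question using only the context below. "
        "Summarize succinctly as bullet points.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )
    completion = openai.Completion.create(
        engine="travel-davinci",  # your deployment name
        prompt=prompt,
        max_tokens=400,
        temperature=0.2,
    )
    return completion["choices"][0]["text"].strip()


# Streamlit UI: a simple search bar that prints the bulleted answer
st.title("Travel search")
query = st.text_input("What would you like to know?")
if query:
    st.markdown(answer_query(query))
```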

The full code can be found in this repo; make sure to install all the requirements. There are some notebooks you can explore as well, quick and dirty but good enough.

Run Streamlit app

That’s it! Now you have your customized search bar. In the terminal where you have your search.py and environment variables, run the command streamlit run search.py to check the search bar locally, like so:

You can explore more in the openai-cookbook. Let me know if this has been helpful, and comment below what you’re building with ChatGPT. Thank you for your time!
