Chat with your own data with LangChain, PGVector, custom cache in postgres using OpenAI — Part 1

Background

Kamlesh Belote · Nov 7, 2023

To develop an AI solution, I was looking for a framework that would simplify AI project development. I came across the LangChain framework, which provides features that, used together, can reduce response time and token consumption. We will use Python as the development language.

We are going to use the following:

  1. OpenAI : the LLM
  2. LangChain : framework used to chain prompts
  3. PGVector : stores the embeddings and is used for similarity search
  4. Docker : container for the Postgres database

In this part, we are going to create a Flask REST API application with Swagger.

We will use a Python virtual environment to manage our dependencies. To create the project folder and set up the virtual environment, use the following commands:

mkdir langchain_cache

cd langchain_cache

python3 -m venv dev_env

After the virtual environment has been successfully created, activate it using the following command:

source dev_env/bin/activate

Once the virtual environment is activated, let’s start installing the packages required for the application.

pip3 install flask-restx
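flask-restx pulls in Flask as a dependency, so this single command is enough. If you prefer to keep the dependency list reproducible, you can also record it in a requirements.txt file and install from that; this is an optional extra, not part of the original steps:

echo "flask-restx" > requirements.txt

pip3 install -r requirements.txt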

Now create a folder named app and, inside that folder, create a main.py file.

mkdir app
cd app
touch main.py

Once you have created the main.py file, put the code below in it.

from flask_restx import Api, Resource
from flask import Flask


def lang_chain_app():
    # Application factory: create the Flask app and wrap it with a
    # Flask-RESTx Api, which also serves the Swagger UI at the root URL.
    app = Flask(__name__)
    api = Api(app=app)

    @api.route('/api/v1/')
    class Home(Resource):
        def get(self):
            return "Hello, World!"

    return app
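The Api object registers every resource in the generated Swagger documentation. As the project grows in later parts, you may want to group related endpoints together; below is a minimal sketch of how that could look with a Flask-RESTx Namespace. The namespace name and route are illustrative only and are not part of this tutorial's code.

from flask_restx import Namespace, Resource

# Hypothetical namespace for grouping related endpoints; names are illustrative.
chat_ns = Namespace('chat', description='Chat endpoints')


@chat_ns.route('/ping')
class Ping(Resource):
    def get(self):
        return {"status": "ok"}


# Inside lang_chain_app(), the namespace would be registered on the Api:
# api.add_namespace(chat_ns, path='/api/v1/chat')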

At the same level as the app folder, create another file named run.py. We will use this file to run the Flask-RESTx app.

touch run.py

Add the following code to start the application. We will start it in debug mode on port 5005.

from app.main import lang_chain_app


app = lang_chain_app()

if __name__ == "__main__":
    # Run the development server on all interfaces, port 5005, with debug enabled.
    app.run(
        debug=True,
        host='0.0.0.0',
        port=5005
    )

Now that everything is in place, let’s start the application and test the endpoint using the following command:

 python3 run.py

After the application starts successfully, the Swagger UI with the default endpoint is available at the root URL (http://localhost:5005/).
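You can also verify the endpoint programmatically. The short snippet below is one way to do it and assumes the app is running locally on port 5005:

import json
from urllib.request import urlopen

# Call the endpoint defined in main.py; Flask-RESTx returns JSON,
# so the body is the JSON-encoded string "Hello, World!".
with urlopen("http://localhost:5005/api/v1/") as response:
    print(json.load(response))  # prints: Hello, World!

With this skeleton API in place, the following parts build the LangChain, PGVector, and caching pieces on top of it.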
