How To Use New OpenAI Embeddings Model with LangChain

Sudarshan Koirala
2 min read · Jan 31, 2024


Use the new OpenAI embeddings model with LangChain

👨🏾‍💻 GitHub ⭐️| 🐦 Twitter | 📹 YouTube | 👔LinkedIn | ☕️Ko-fi

Image by Author

OpenAI recently made an announcement about new embedding models and API updates. Here is what they have to say about it; for more details, have a look at the announcement.

We are releasing new models, reducing prices for GPT-3.5 Turbo, and introducing new ways for developers to manage API keys and understand API usage. The new models include:

  • Two new embedding models
  • An updated GPT-4 Turbo preview model
  • An updated GPT-3.5 Turbo model
  • An updated text moderation model

This post from Peter Gostev on LinkedIn shows the API cost of GPT-3.5 and the embedding models in a figure, which is easier on the eyes. Thanks, Peter Gostev.

Image from Peter Gostev’s LinkedIn post (link above this figure)

You can implement this the default OpenAI way by following OpenAI’s documentation, but LangChain has integrated it to make our lives even easier (a good part of using a framework). I don’t want to copy-paste all the code for you here; instead, here is the GitHub code where I have implemented it (well, taken from LangChain with some additional stuff added on top). I have also shown how to implement this without LangChain, using the openai Python package.
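To give you a quick feel for both approaches, here is a minimal sketch. The model name text-embedding-3-small, the sample texts, and the optional dimensions argument are just illustrative choices; check the GitHub repo and the official docs for the full version.

```python
# pip install langchain-openai openai
from langchain_openai import OpenAIEmbeddings
from openai import OpenAI

# --- With LangChain ---
embeddings = OpenAIEmbeddings(model="text-embedding-3-small")

# Embed a single query string
query_vector = embeddings.embed_query("What are the new OpenAI embedding models?")

# Embed a batch of documents
doc_vectors = embeddings.embed_documents(
    [
        "text-embedding-3-small is the cheaper model.",
        "text-embedding-3-large is the more capable model.",
    ]
)
print(len(query_vector), len(doc_vectors))

# --- Without LangChain, using the openai Python package directly ---
client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.embeddings.create(
    model="text-embedding-3-small",
    input="What are the new OpenAI embedding models?",
    dimensions=256,  # optional: the new models support shortened embeddings
)
print(len(response.data[0].embedding))
```

In both cases you get back plain Python lists of floats, so you can drop the vectors straight into whatever vector store or similarity search you are already using.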

Now you know how to implement the new OpenAI embedding models both with and without LangChain. If you prefer a video walkthrough, here is the link.

👨🏾‍💻 GitHub ⭐️| 🐦 Twitter | 📹 YouTube | 👔LinkedIn | ☕️Ko-fi

Recommended YouTube playlists:

  1. LangChain-Framework-Build-Around-LLMs
  2. 30 Days of Databricks
  3. LlamaIndex Playlist

Thank you for your time in reading this post!

Make sure to leave your feedback and comments. See you in the next blog, stay tuned 📢
