Stream OpenAI Responses through FastAPI to Next.js 14 — Better Version

Tim Nirmal
4 min read · Aug 20, 2024

Have you seen how, in ChatGPT, messages stream in as they are generated? You don't have to wait for the LLM to generate the whole response before you start receiving it.

This can be done easily with the API provided by OpenAI. But what if we need to use a backend and stream the response through it?
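To give an idea of what that looks like on the client side, here is a minimal sketch of reading a streamed text response with the Fetch API. The endpoint path `/chat` and the request body shape are my own assumptions for illustration; the real backend is described in the previous post.

```typescript
// Minimal sketch: read a streamed text response from the backend.
// The URL and body shape here are assumptions, not the post's exact API.
async function streamChat(prompt: string, onChunk: (text: string) => void) {
  const response = await fetch("http://localhost:8000/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (!response.body) throw new Error("No response body to stream");

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Hand each decoded chunk to the UI as it arrives.
    onChunk(decoder.decode(value, { stream: true }));
  }
}
```

Each chunk can be appended to the message state as it arrives, so the text grows on screen the same way it does in ChatGPT.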

For this we use the following technologies: the OpenAI API, FastAPI, and Next.js 14.

This is a new, better version of my previous post, so if you want more details about the streaming and the backend, please read that one first. Only the frontend has changed in this new version.

If you have any questions, feel free to contact me at timnirmal.com or timnirmal@gmail.com.

Set up Next.js

Create a Next.js project. Here I have used Next.js 14, Tailwind CSS, and TypeScript, and I run the frontend on port 4000. Make sure to set the backend API URL/port correctly (here it's 8000).
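One simple way to keep that backend URL configurable is to read it from an environment variable in a small config module. This is just a sketch; the variable name `NEXT_PUBLIC_API_URL` is my own choice, not something from the original project:

```typescript
// lib/config.ts — single place for the backend base URL.
// NEXT_PUBLIC_API_URL is an assumed variable name; set it in .env.local, e.g.
//   NEXT_PUBLIC_API_URL=http://localhost:8000
export const API_URL = process.env.NEXT_PUBLIC_API_URL ?? "http://localhost:8000";
```

The dev server can then be started on port 4000 with `npm run dev -- -p 4000`.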
