TDS Archive

An archive of data science, data analytics, data engineering, machine learning, and artificial intelligence writing from the former Towards Data Science Medium publication.

Generative Q&A With GPT 3.5 and Long-Term Memory

Exploring the new world of retrieval-augmented ML

6 min read · Feb 17, 2023

Photo by Bret Kavanaugh on Unsplash. Originally published at pinecone.io, where the author is employed.

Generative AI sparked several “wow” moments in 2022, from generative art tools like OpenAI’s DALL-E 2, Midjourney, and Stable Diffusion, to the next generation of Large Language Models like OpenAI’s GPT-3.5 models and BLOOM, and chatbots like LaMDA and ChatGPT.

It’s hardly surprising that Generative AI is experiencing a boom in interest and innovation [1]. Yet, this marks just the first year of widespread adoption of generative AI: the early days of a new field poised to disrupt how we interact with machines.

One of the most thought-provoking use cases is Generative Question-Answering (GQA). Using GQA, we can sculpt human-like interactions with machines for information retrieval (IR).

We all use IR systems every day. Google Search indexes the web and retrieves information relevant to your search terms. Netflix uses your behavior and history on the platform to recommend new TV shows and movies, and Amazon does the same with products [2].

These applications of IR are world-changing. Yet, they may be little more than a faint echo of what we will see in the coming months and years with the combination of IR and GQA.
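To make that combination concrete, here is a minimal sketch of a retrieval-augmented QA loop: embed the question, pull the most relevant passages from a vector index, and let a language model answer using only that retrieved context. The Pinecone index name (`gqa-demo`), environment string, and the `text` metadata field are illustrative placeholders, and the model choices (`text-embedding-ada-002`, `text-davinci-003`) simply reflect what was widely used at the time of writing; this is a sketch of the pattern, not an exact pipeline.

```python
import openai
import pinecone

openai.api_key = "YOUR_OPENAI_API_KEY"

# Assumes a pre-built Pinecone index; name and environment are placeholders
pinecone.init(api_key="YOUR_PINECONE_API_KEY", environment="us-east1-gcp")
index = pinecone.Index("gqa-demo")


def answer(question: str) -> str:
    # 1. Embed the question
    res = openai.Embedding.create(
        input=[question],
        model="text-embedding-ada-002",
    )
    xq = res["data"][0]["embedding"]

    # 2. Retrieve the most relevant passages from long-term memory
    matches = index.query(vector=xq, top_k=3, include_metadata=True)["matches"]
    # Assumes each record stores its source passage in a "text" metadata field
    context = "\n".join(m["metadata"]["text"] for m in matches)

    # 3. Generate an answer grounded in the retrieved context
    prompt = (
        "Answer the question using the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    completion = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        max_tokens=200,
        temperature=0.0,
    )
    return completion["choices"][0]["text"].strip()


print(answer("What is retrieval-augmented generation?"))
```

The important design point is that the generative model answers from whatever the vector index returns rather than from its frozen training data alone, which is what gives the system a form of long-term memory.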



Written by James Briggs

Freelance ML engineer learning and writing about everything. I post a lot on YT https://www.youtube.com/c/jamesbriggs
