
Generative AI - Document Retrieval and Question Answering with LLMs

Apply LLMs to your domain-specific data

Sascha Heyer
Google Cloud - Community
7 min read · Jun 5, 2023


With Large Language Models (LLMs), we can integrate domain-specific data to answer questions. This is especially useful for data unavailable to the model during its initial training, like a company's internal documentation or knowledge base.

This architecture is called Retrieval-Augmented Generation (RAG) or, less commonly, Generative Question Answering.

This article helps you understand how to implement this architecture using an LLM and a vector database. With it, we can significantly reduce the hallucinations commonly associated with LLMs.
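The core idea can be sketched in a few lines: embed the documents, retrieve the ones most similar to the question, and place them in the prompt as grounding context. The sketch below is a minimal, self-contained illustration; the toy bag-of-words `embed` function and the sample documents are stand-ins for a real embedding model and your own knowledge base.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding" for illustration only; a real system
    # would call an embedding model and store vectors in a vector database.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical domain-specific documents (e.g. internal documentation).
documents = [
    "Employees receive 30 vacation days per year.",
    "The office is closed on public holidays.",
    "Expense reports are due by the fifth of each month.",
]

def retrieve(question, k=1):
    # Rank documents by similarity to the question and keep the top k.
    q = embed(question)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(question):
    # Augment the prompt with the retrieved context so the LLM answers
    # from your documents instead of its training data.
    context = "\n".join(retrieve(question))
    return (
        f"Answer the question using only this context:\n{context}\n\n"
        f"Question: {question}"
    )

print(build_prompt("How many vacation days do employees get?"))
```

In a production setup, the prompt returned by `build_prompt` would be sent to an LLM, and the retrieval step would query a vector database over pre-computed embeddings rather than scoring every document on the fly.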

This architecture supports a wide range of use cases and reduces the time we spend interacting with documents. Instead of sifting through search results ourselves, the LLM finds the most relevant documents and uses them to generate an answer directly from your content.

Jump to the Notebook and Code

All the code for this article is ready to use in a Google Colab notebook. If you have questions, please reach out to me via LinkedIn or Twitter.
