Creating Impact: A Spotlight on 6 Practical Retrieval Augmented Generation Use Cases

Abhinav Kimothi
Dec 4, 2023

In 2023, RAG became one of the most widely used techniques in the domain of Large Language Models. In fact, it is hard to find an LLM-powered application that doesn’t use RAG in one way or another. Here are 6 use cases in which RAG plays a pivotal part.

If you’re interested in finding out more about retrieval augmented generation, do give my blog a read-

Document Question Answering Systems

By giving an LLM access to proprietary enterprise documents, its responses can be limited to the information those documents contain. A retriever searches for the most relevant documents and supplies that information to the LLM, as sketched below. Check out this blog for an example —
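Here is a minimal sketch of that retrieve-then-generate flow in Python. The embedding model, the sample documents and the llm_generate placeholder are just illustrative assumptions, not a prescribed implementation.

```python
# Minimal document QA sketch: embed documents, retrieve the closest ones
# for a question, then build a grounded prompt for an LLM.
from sentence_transformers import SentenceTransformer
import numpy as np

model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative embedding model

documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Premium support is available 24/7 for enterprise customers.",
    "The onboarding guide covers account setup and SSO configuration.",
]
doc_embeddings = model.encode(documents, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (cosine similarity)."""
    query_embedding = model.encode([query], normalize_embeddings=True)
    scores = (doc_embeddings @ query_embedding.T).ravel()
    top_idx = np.argsort(-scores)[:k]
    return [documents[i] for i in top_idx]

def llm_generate(prompt: str) -> str:
    """Placeholder: swap in a call to your preferred LLM API."""
    raise NotImplementedError

def answer(question: str) -> str:
    """Compose a prompt that restricts the LLM to the retrieved context."""
    context = "\n".join(retrieve(question))
    prompt = (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    return llm_generate(prompt)
```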

Conversational Agents

LLMs can be customised with product/service manuals, domain knowledge, guidelines, etc. using RAG. The agent can also route users to more specialised agents…
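A minimal sketch of such routing might look like the following; the keyword rules, agent names and placeholder helpers are purely illustrative and not tied to any particular framework.

```python
# Illustrative conversational agent: route the user to a specialised agent,
# retrieve relevant manual/guideline passages, then ask the LLM to respond.
# retrieve_context() and llm_generate() are hypothetical placeholders
# (e.g. the retriever from the previous sketch and any chat API).

def route(message: str) -> str:
    """Very simple keyword routing; real systems often use an LLM classifier."""
    text = message.lower()
    if any(word in text for word in ("invoice", "refund", "payment")):
        return "billing_agent"
    if any(word in text for word in ("error", "crash", "not working")):
        return "technical_support_agent"
    return "general_agent"

def retrieve_context(message: str) -> str:
    """Placeholder: fetch relevant manual or guideline passages."""
    raise NotImplementedError

def llm_generate(prompt: str) -> str:
    """Placeholder: call your preferred LLM API."""
    raise NotImplementedError

def respond(message: str) -> str:
    """Ground the specialised agent's reply in retrieved context."""
    agent = route(message)
    context = retrieve_context(message)
    prompt = (
        f"You are the {agent}. Answer the user using only the context below.\n"
        f"Context:\n{context}\n\nUser: {message}\nAgent:"
    )
    return llm_generate(prompt)
```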

