Generative AI + RAG Could Boost Productivity and Political Polarization
The Promises and Perils of Custom LLMs
Retrieval Augmented Generation, or RAG, should reduce AI “hallucinations.” More reliable Generative AI will increase productivity, but it could also exacerbate cultural and political polarization.
RAG is a pretty simple concept: create a specialized database of documents, retrieve the most relevant ones for each query, and have the AI draw on them to generate its output. It’s not a new model so much as a standard Large Language Model, or LLM, paired with a curated knowledge source. Whereas ChatGPT was trained on a vast snapshot of the web through 2021, a RAG database consists of a much smaller and more carefully curated dataset.
For example, a company could upload all of its HR documents to a RAG system, and employees could interact with a chatbot custom-built to answer queries using only the information in those hand-picked HR documents. Or a business could upload its standard operating procedures, and a chatbot would assist workers based solely on those SOPs.
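To make the idea concrete, here is a minimal sketch of the retrieval step in Python. It uses simple keyword overlap to rank documents; real RAG systems use vector embeddings and then pass the retrieved text to an LLM. All names and the sample HR documents here are illustrative, not from any real system.

```python
# Minimal sketch of RAG's retrieval step, using keyword overlap.
# Production systems rank with vector embeddings instead.

def score(query, doc):
    """Count how many query words also appear in the document."""
    q_words = set(query.lower().split())
    d_words = set(doc.lower().split())
    return len(q_words & d_words)

def retrieve(query, documents, top_k=2):
    """Return the top_k documents most relevant to the query."""
    ranked = sorted(documents, key=lambda d: score(query, d), reverse=True)
    return ranked[:top_k]

def build_prompt(query, documents):
    """Augment the user's question with only the retrieved context."""
    context = "\n".join(retrieve(query, documents))
    return (f"Answer using ONLY the context below.\n"
            f"Context:\n{context}\n\nQuestion: {query}")

# Hypothetical hand-picked HR documents.
hr_docs = [
    "Employees accrue 15 vacation days per year.",
    "The dental plan covers two cleanings annually.",
    "Expense reports are due by the fifth of each month.",
]

prompt = build_prompt("How many vacation days do I get?", hr_docs)
# This prompt would then go to the LLM, which answers from the
# retrieved HR text rather than its general training data.
```

Because the model is instructed to answer only from the retrieved context, its output stays grounded in the curated documents, which is where the reduction in hallucinations comes from.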
AI researchers have found that RAG reduces hallucinations substantially but does not eliminate them entirely. Still, it doesn’t take an AI evangelist to see RAG’s potential for making Generative AI far more useful.