Building an Agentic RAG with LanceDB, Anthropic’s Model Context Protocol (MCP), Bedrock, and Ollama in Google Colab
Exploring what it means to be minimally ‘agentic’ using the MCP, a vector database & both cloud and locally-hosted LLMs to navigate and query documents intelligently.
The author of a recent article wrote a great piece about combining Anthropic’s Model Context Protocol, Claude, and LanceDB to create a better-than-average ‘true agentic’ RAG solution. The author uses a Lance database with a pair of tables containing summary information and embeddings for some healthcare-related PDFs. Adding new documents and updating the index is made easy with a ‘seed’ function that can overwrite or retain the original data, using a hash stored as metadata to detect unchanged files. Finally, Claude accesses this database through a lance-mcp server the author wrote and shared on GitHub along with the rest of the example [ link ].
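To make that two-table layout concrete, here is a minimal sketch in Python of what such a seed step might look like. The table names (`catalog`, `chunks`), the chunking scheme, and the toy `embed()` helper are my own assumptions for illustration; the actual lance-mcp project may organize things differently.

```python
import hashlib
import lancedb

def file_hash(path: str) -> str:
    """Content hash stored as metadata so unchanged documents can be skipped."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def embed(text: str) -> list[float]:
    """Toy stand-in embedding -- replace with a Bedrock or Ollama embedding call."""
    digest = hashlib.sha256(text.encode()).digest()
    return [b / 255.0 for b in digest[:16]]  # 16-dim placeholder, not semantic

def seed(db_path: str, docs: dict[str, str], overwrite: bool = False) -> None:
    """Populate a summary ('catalog') table and an embedding ('chunks') table.

    `docs` maps a PDF path to its extracted text. With overwrite=False,
    documents whose content hash is already present are retained as-is.
    """
    db = lancedb.connect(db_path)

    existing = set()
    if not overwrite and "catalog" in db.table_names():
        existing = set(db.open_table("catalog").to_pandas()["hash"])

    catalog_rows, chunk_rows = [], []
    for path, text in docs.items():
        h = file_hash(path)
        if h in existing:
            continue  # unchanged document: keep the original rows
        catalog_rows.append({"source": path, "hash": h, "summary": text[:500]})
        for i in range(0, len(text), 1000):  # naive fixed-size chunking
            chunk = text[i:i + 1000]
            chunk_rows.append({"source": path, "hash": h,
                               "text": chunk, "vector": embed(chunk)})

    if not catalog_rows:
        return
    if overwrite or "catalog" not in db.table_names():
        db.create_table("catalog", data=catalog_rows, mode="overwrite")
        db.create_table("chunks", data=chunk_rows, mode="overwrite")
    else:
        db.open_table("catalog").add(catalog_rows)
        db.open_table("chunks").add(chunk_rows)
```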
Surprisingly, the author shows how this relatively simple solution goes beyond typical Retrieval-Augmented Generation (RAG): Claude leverages the descriptions of the MCP server’s tools, together with the initial prompt, to ‘figure out’ the best way to search and formulate its queries. This leads to more accurate and relevant results than expected, without much effort to guide or pre-determine the process.
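The mechanism behind that behavior is that each MCP tool is exposed to the model along with a natural-language description. Below is a minimal sketch, using the MCP Python SDK’s FastMCP helper, of how a LanceDB-backed server might expose such tools; the server name, tool names, and behavior are assumptions for illustration, not the actual lance-mcp interface.

```python
import hashlib
import lancedb
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("lance-demo")        # hypothetical server name
db = lancedb.connect("./lancedb")  # database path is an assumption

def embed(text: str) -> list[float]:
    """Toy stand-in for the same embedding model used at seed time."""
    digest = hashlib.sha256(text.encode()).digest()
    return [b / 255.0 for b in digest[:16]]

@mcp.tool()
def list_documents() -> list[dict]:
    """Return the summary catalog so the model can pick relevant source files."""
    return db.open_table("catalog").to_pandas().to_dict("records")

@mcp.tool()
def search_chunks(query: str, source: str | None = None) -> list[dict]:
    """Vector-search document chunks, optionally restricted to one source file."""
    q = db.open_table("chunks").search(embed(query)).limit(8)
    if source:
        q = q.where(f"source = '{source}'")
    return q.to_list()

if __name__ == "__main__":
    mcp.run()  # serve over stdio so an MCP client such as Claude can call the tools
```

Because each tool’s docstring is passed to the client model as its description, the LLM can decide on its own whether to consult the catalog first or jump straight to chunk search, which is exactly the lightly ‘agentic’ behavior the author observed.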