Learning Generative AI
Connecting to MCP on a Local LLM Programmatically
Experiment with the Model Context Protocol (MCP) on a Local LLM without Limits
Previously, I shared how we can link up MCP with a local LLM using the third-party library PraisonAI.
The above approach is great for getting everything set up quickly, but because it does all the work behind the scenes for us, we have less control over the MCP function calls and their output.

In this article, I'll share how we can gain better control over the MCP function calls and output as we develop an application that liaises with both the LLM and the MCP server.
Setup the MCP
First of all, we want to get the MCP server running. For our case, let's use the Airbnb MCP server provided by OpenBnB (you can find more MCP references here).
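If your MCP client reads a JSON server configuration, the Airbnb server can be registered roughly like this. This is a sketch: the package name `@openbnb/mcp-server-airbnb` and the `npx` invocation follow OpenBnB's published README; verify them against the current documentation before use.

```json
{
  "mcpServers": {
    "airbnb": {
      "command": "npx",
      "args": ["-y", "@openbnb/mcp-server-airbnb"]
    }
  }
}
```

With this in place, the client spawns the server as a child process and communicates with it over stdio, which is the transport we'll rely on when calling its tools programmatically.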