Connect Function Calls With Your Local LLM
Get Your Local LLM to Perform the Specific Functions You Want
LLMs are great at taking prompts and giving verbal answers. But if we want one to do something specific, wouldn’t it be nice if we could ask it to call a function of our choosing?
Here, I’m going to show you how we can achieve that using Ollama (the local LLM runner whose installation I shared in a previous article).
Prerequisite
To continue, you first need to install Ollama locally, as shared in the article above.
For this, we want to use the llama3.1 or llama3.2 model. The original llama3 doesn’t have the needed tool-calling support, while llama3.3 is too large to run efficiently on a local MacBook Pro: llama3.3 is over 40 GB, whereas the 3.1 and 3.2 models are both under 5 GB.
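If you want to verify your setup before diving in, here’s a minimal sketch using the `ollama` Python package (an assumption on my part; install it with `pip install ollama`) that pulls the model and runs a quick smoke test:

```python
# Minimal setup check: pull a tool-capable model and confirm it responds.
# Assumes the Ollama server is running locally and the `ollama` Python
# package is installed (pip install ollama).
import ollama

MODEL = "llama3.1"  # or "llama3.2"; both support tool calling and are under 5 GB

# Download the model if it isn't already available locally
ollama.pull(MODEL)

# Plain chat request (no tools yet) to confirm the model runs
response = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response["message"]["content"])
```

If this prints a greeting, the model is downloaded and working, and we can move on to wiring up actual function calls.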