Tech AI Chat

Chat on Technology Archive and Insights

Learning Generative AI

Connect Function Call With Your Local LLM

Get your Local LLM to Perform Specific Function of Your Desire

13 min read · Jun 3, 2025

Photo by Joan Gamell on Unsplash

An LLM is great at answering prompts verbally. But if we want it to do something specific, wouldn't it be nice if we could ask it to call a function of our choosing?

Here, I'm going to show you how we can achieve that using Ollama, the local LLM whose installation I shared previously.
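To give a flavor of where we are heading, here is a minimal sketch of tool calling with the ollama Python library. This is an illustration under stated assumptions, not the article's exact code: `get_weather` is a hypothetical example function, the tool schema follows the JSON-schema format Ollama accepts, and the response handling assumes a recent (0.4+) ollama-python client with a local Ollama server running.

```python
def get_weather(city: str) -> str:
    """Hypothetical local function the model may ask us to call."""
    return f"It is sunny in {city}"


# JSON-schema description of the function, passed to the model as a tool.
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# Registry mapping tool names to the actual Python callables.
AVAILABLE = {"get_weather": get_weather}


def dispatch(name: str, arguments: dict) -> str:
    """Run the function the model requested, with the arguments it supplied."""
    return AVAILABLE[name](**arguments)


def ask(question: str) -> str:
    """Send the question to the local model and execute any tool call it makes.

    Requires `pip install ollama` and a running Ollama server.
    """
    import ollama

    response = ollama.chat(
        model="llama3.1",
        messages=[{"role": "user", "content": question}],
        tools=[WEATHER_TOOL],
    )
    # ollama-python 0.4+ returns typed response objects; each tool call
    # carries the function name and already-parsed arguments.
    for call in response.message.tool_calls or []:
        return dispatch(call.function.name, call.function.arguments)
    return response.message.content
```

The key idea is the loop at the end: the model does not execute anything itself. It only returns the name of a tool and the arguments it wants; our own code looks the name up in a registry and runs the real function.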

Prerequisites

To continue, you first need to download Ollama locally, as shared above.

For this, we want to use the llama3.1 or llama3.2 model. llama3 doesn't have the needed tool-calling support, while llama3.3 is too huge to run efficiently on a local MacBook Pro: llama3.3 is over 40 GB, while the 3.1 and 3.2 models are both under 5 GB.
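Assuming Ollama is already installed, the models can be fetched from the command line (model names as listed on the Ollama model registry):

```shell
# Pull the tool-capable models (each under 5 GB)
ollama pull llama3.1
ollama pull llama3.2

# Verify they are available locally
ollama list
```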

Written by Elye - A One Eye Dev By His Grace

Sharing my software, life, and faith journey. Follow me on Twitter/X to access my articles for free.