Unleashing LLMs: Function Calling with LangChain, Ollama, and Microsoft’s Phi-3 (Part 2)

Anoop Maurya
May 15, 2024

In the previous article, we explored Ollama, a powerful tool for running large language models (LLMs) locally. This article goes deeper, showcasing a practical application: implementing function calling with LangChain, Ollama, and Microsoft’s Phi-3 model. With function calling, the model responds with a structured call to a function you define (a name plus JSON arguments) rather than free-form text, enabling a more structured and controlled interaction.

Unveiling the Tools:

  • LangChain (https://github.com/langchain-ai/langchain): A versatile framework for building applications on top of LLMs. It provides abstractions for prompts, chains, and tool/function calling, acting as a bridge between models and the external tools or functions you want them to use.
  • Ollama (https://ollama.com/): As discussed before, Ollama simplifies running LLMs locally, making Phi-3 accessible for this project. Ollama provides user-friendly commands for managing and interacting with LLM models.
  • Microsoft’s Phi-3 (https://huggingface.co/microsoft/Phi-3-mini-128k-instruct): An open-source LLM known for its efficiency and capabilities in tasks like text generation, translation, and question answering. We’ll leverage Phi-3’s functionalities for demonstration purposes.

Setting the Stage:

  1. Install Ollama: Refer to the Ollama documentation (https://github.com/ollama/ollama) for the most up-to-date installation instructions specific to your operating system. Ensure you have the latest version for compatibility. Ollama installation might involve downloading binaries or using a package manager specific to your OS.
  2. Download Phi-3 Weights: Use the ollama pull command within your terminal to download the Phi-3 model weights. Here's an example:
ollama pull phi3

Note: This downloads the necessary files for running Phi-3 locally with Ollama.
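
To confirm the download succeeded, you can list the models Ollama has installed locally:

ollama list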

3. Understanding Phi-3 Functionalities: While Phi-3 offers various capabilities like text summarization, translation, and question answering, we’ll focus on structured function calling, using a simple weather-lookup tool, for this example.

Important Note: The specific functionalities exposed by an LLM depend on the model itself. Refer to Phi-3’s documentation for a comprehensive list of its capabilities to explore different potential applications.

Building the Function Call with LangChain:

  • Importing Libraries:
from langchain_experimental.llms.ollama_functions import OllamaFunctions
  • Loading the Model:
model = OllamaFunctions(
    model="phi3",
    keep_alive=-1,   # keep the model loaded in memory between calls
    format="json",   # request JSON output, required for function calling
)

Note: If you hit an error while creating the model instance, try upgrading langchain, langchain-experimental, and the other LangChain packages.
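
For example, upgrading the relevant packages (these are the standard LangChain distribution names on PyPI) usually resolves import or constructor errors:

pip install -U langchain langchain-experimental langchain-core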

  • Binding Tools to the LLM:
model = model.bind_tools(
    tools=[
        {
            # The tool is described with a JSON Schema so the model
            # knows what arguments to produce.
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                    },
                },
                "required": ["location"],
            },
        }
    ],
    # Force the model to call this specific function on every invocation.
    function_call={"name": "get_current_weather"},
)

  • Inference:
response = model.invoke("what is the weather in Singapore?")
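
Because we bound and forced a function call, the response is not free-form text. With OllamaFunctions, the call typically arrives as structured data on the returned message; the exact layout depends on your LangChain version, so treat this as a sketch:

print(response.additional_kwargs)
# Expected shape (may vary by version):
# {'function_call': {'name': 'get_current_weather',
#                    'arguments': '{"location": "Singapore", "unit": "celsius"}'}}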

Running the Function Call:

  1. Execute the Python Script: Save the snippets above as a Python file (e.g., filename.py) and run it from your terminal using python filename.py.
  2. Interact with the LLM: The script sends the query to Phi-3 through Ollama and LangChain. On success, the model returns a structured function call: the tool name plus the JSON arguments it extracted from your question, as shown in the sketch below.
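
The model only decides which function to call and with what arguments; executing it is your code’s responsibility. A minimal dispatch sketch (get_current_weather here is a hypothetical stub, not a real weather API):

import json

def get_current_weather(location: str, unit: str = "celsius") -> str:
    # Hypothetical stub: replace with a call to a real weather service.
    return f"It is currently 31 degrees {unit} in {location}."

call = response.additional_kwargs["function_call"]  # shape may vary by version
if call["name"] == "get_current_weather":
    args = json.loads(call["arguments"])  # arguments arrive as a JSON string
    print(get_current_weather(**args))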

Expanding the Horizons:

This example demonstrates a basic function call using LangChain, Ollama, and Phi-3. With this approach, you can explore various possibilities to enhance your LLM interactions:

  • Exposing More Functionalities: Explore other capabilities offered by Phi-3 or other LLMs you run with Ollama. Design LangChain prompts to call these functionalities. For instance, you could potentially use Phi-3 for sentiment analysis, question answering, or different text generation tasks.
  • Building Complex Workflows: Chain multiple functional calls together to create intricate workflows. For example, you could call a summarization function followed by a sentiment analysis function on the summary, or a translation function followed by a text generation function for creative writing tasks.
  • Customizing Prompts: Refine the LangChain prompt templates to tailor the interaction with the LLM and potentially influence its output.
  • Error Handling and Refinement: The snippets above omit error handling for clarity. Consider adding exception handling so your application can gracefully recover from issues such as Ollama being unreachable or the model returning malformed JSON; see the sketch after this list. Additionally, experiment with different LangChain functionalities and libraries to optimize your workflows.
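
As a starting point, a broad try/except keeps the script from crashing when Ollama is unreachable or the model emits malformed output (a sketch; narrow the exception types for production code):

try:
    response = model.invoke("what is the weather in Singapore?")
    print(response.additional_kwargs.get("function_call"))
except Exception as exc:
    # Catch-all for demonstration only; prefer specific exception types.
    print(f"LLM call failed: {exc}")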

Important Note: Remember that LLMs are still under development, and their outputs might not always be perfect. As you explore function calling, be prepared for potential inaccuracies or unexpected results.

By leveraging LangChain, Ollama, and the power of LLMs like Phi-3, you can unlock new possibilities for interacting with these advanced AI models. This approach empowers you to create custom applications and workflows tailored to your specific needs.

Conclusion: Bridging the Gap with Function Calling:

This article series has explored the exciting concept of function calling with LangChain, Ollama, and Microsoft’s Phi-3 model. We’ve delved into the tools, the setup process, and the implementation of a basic weather-lookup function call as an example.

By implementing function calling, you effectively bridge the gap between you and the vast capabilities of LLMs. You can treat these models as libraries with specific functionalities, allowing for a more structured and controlled interaction. This approach opens doors to various possibilities:

  • Simplified LLM Interaction: Function calling removes the need to hand-craft complex prompts for every interaction. It offers a more user-friendly way to access specific LLM functionalities.
  • Enhanced Control: By defining functions, you dictate how the LLM is used within your workflow. This provides greater control over the interaction and the expected output.
  • Streamlined Workflows: With function calling, you can chain multiple LLM functionalities together, creating intricate and powerful workflows for various applications.

Looking Ahead:

The potential of function calling with LangChain and Ollama extends far beyond the provided example. Here are some exciting areas to explore further:

  • Custom LLM Functionalities: As LLM capabilities evolve, the possibilities for defining custom functionalities will grow. Explore ways to tailor LLM functionalities to your specific needs.
  • Integration with Other Tools: LangChain allows for integration with various AI tools and frameworks. Combine functional calling with other AI components to create powerful and comprehensive applications.
  • Community Collaboration: The LangChain and Ollama communities are active and fast-moving. Participate in discussions and contribute to the advancement of function calling and LLM accessibility.

By embracing function calling, you can unlock the true potential of large language models within your projects. Remember, this is just the beginning. As LLMs and their functionalities continue to develop, function calling will become an even more powerful tool for harnessing AI in a structured and controlled manner.

Additional Resources:

Ollama Official WebSite: https://ollama.com/
Ollama Github: https://github.com/ollama/ollama?tab=readme-ov-file
Cheat Sheet: https://cheatsheet.md/llm-leaderboard/ollama.en
Implementation Code: https://github.com/imanoop7/llm-function-calling/blob/main/loacal-llm/phi-3.ipynb

Feel free to explore these resources, and happy learning!
If you have any more questions, feel free to ask. 😊

If you liked this article and you want to support me:

  1. Clap for my article 10 times; that will really help me out. 👏
  2. Follow me on Medium and subscribe for free to get my latest articles. 🫶

