Ollama + HuggingFace ✅🔥

Sudarshan Koirala
2 min readFeb 25, 2024


Create Custom Models From Huggingface with Ollama

👨🏾‍💻 GitHub ⭐️| 🐦 Twitter | 📹 YouTube | 👔LinkedIn | ☕️Ko-fi


Ollama helps you get up and running with large language models locally in a few easy and simple steps. Hugging Face, by comparison, hosts more than half a million models. Wouldn't it be cool if we could create custom Ollama models from Hugging Face models? If your answer is yes, you have landed on the right post 😎

If you are new to Ollama, I have created a playlist; take your time to go through it, no pressure!

Here are the steps to create a custom model.

  1. Make sure you have Ollama installed and running ( no walking 😄 )
  2. Go to the Hugging Face website and download the model ( I downloaded a GGUF model )
  3. Create a Modelfile and fill in the necessary settings.
  4. Create a model out of this Modelfile and run it locally in the terminal.

Now, you might be wondering what the necessary things to put in the Modelfile are. I know what you're thinking, so here it is 🤗 In this example, I am using the TheBloke/CapybaraHermes-2.5-Mistral-7B-GGUF model.

# Modelfile
FROM "./capybarahermes-2.5-mistral-7b.Q4_K_M.gguf"

# ChatML turn markers, so generation stops cleanly at the end of a turn
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"

TEMPLATE """
<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
"""

You might also be wondering how to create and run the custom model. For that too, here it is ✌️

ollama create my-own-model -f Modelfile
ollama run my-own-model
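The terminal isn't the only way to talk to your new model: the Ollama server also exposes a local REST API on port 11434. Below is a stdlib-only Python sketch against the `/api/generate` endpoint; the model name matches the `ollama create` command above, and the actual request is left commented out because it needs the Ollama server running:

```python
# Query a custom model through Ollama's local REST API (default port 11434).
# Standard library only; the POST itself is commented out because it
# requires a running Ollama server.
import json
from urllib import request

def build_request(model: str, prompt: str, host: str = "http://localhost:11434"):
    """Prepare a non-streaming /api/generate request for an Ollama model."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_request("my-own-model", "Why is the sky blue?")

# With the Ollama server running:
# with request.urlopen(req) as resp:
#     print(json.load(resp)["response"])
```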

Now you know how to create a custom model from a model hosted on Hugging Face with Ollama. Give it a try, and good luck! Still, if you prefer a video walkthrough, here is the link.


Recommended YouTube playlists:

  1. LangChain-Framework-Build-Around-LLMs
  2. 30 Days of Databricks
  3. LlamaIndex Playlist
  4. Ollama Playlist

Thank you for your time in reading this post!

Make sure to leave your feedback and comments. See you in the next blog, stay tuned 📢
