Run Your Local AI with Ollama

Ria Banerjee
May 21, 2024


Ollama is an amazing tool! With Ollama, you can run large language models (LLMs) locally on your own machine. It's like having your own private ChatGPT, isn't that exciting?

To get Ollama, simply go to https://ollama.com/ and download the version for your operating system. At the time of writing, only a preview build is available for Windows.

Once you have downloaded and installed Ollama, check that it is running by opening http://localhost:11434 in your browser.
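You can also verify it from the terminal. A quick check (assuming the default port, 11434):

curl http://localhost:11434

This should print "Ollama is running".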

Ollama maintains a model library at https://ollama.com/library, where you can search for and pick the model you like.

You can also get models from https://huggingface.co/.

To download a model, open a terminal and run:

ollama pull <modelname>

Once the download completes, run the following command:

ollama run <modelname>

Voilà! Your LLM is running and ready to answer your questions.
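For example, with the llama3 model (the model name here is just an illustration, check the Ollama library for current names and tags):

ollama pull llama3
ollama run llama3

You can list the models you have downloaded with ollama list and remove one with ollama rm <modelname>.

Under the hood, Ollama also serves a REST API on the same port 11434, which is what UI tools connect to. A minimal sketch of a completion request (again, the model name is illustrative):

curl http://localhost:11434/api/generate -d '{"model": "llama3", "prompt": "Why is the sky blue?"}'

The response streams back as a series of JSON objects.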

If you would prefer a ChatGPT-style UI, you can have that too with Open WebUI. Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs.

You can find more info about Open WebUI at https://github.com/open-webui/open-webui.

Install Docker and run the following command:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
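A quick note on the flags: -p 3000:8080 publishes the container's internal port 8080 on port 3000 of your machine, -v open-webui:/app/backend/data stores your chats and settings in a named Docker volume so they survive restarts, and --add-host=host.docker.internal:host-gateway lets the container reach the Ollama server running on your host.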

Open http://localhost:3000 and you have your UI. Register an account, select your model, and start asking questions.

Ollama has a lot of other options too. You can copy models, customize them with a Modelfile, and create and run your own variants on your local system.
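As a minimal sketch of customization (the base model llama3 and the name secbot are just examples), create a file named Modelfile:

FROM llama3
PARAMETER temperature 0.7
SYSTEM You are a concise assistant specialized in cyber security.

Then build and run your custom model:

ollama create secbot -f Modelfile
ollama run secbot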

In an upcoming post, I'm going to talk about how to use RAG (retrieval-augmented generation) techniques to interact with our own customized data using an LLM in Ollama. For that purpose, I'm going to use a tool called Verba. More about Verba in the next article.
