Running Open Source LLMs Locally Using Ollama: A Step-by-Step Guide

Kamal Dhungana
6 min read · Oct 12, 2023
Image created by DALL·E 3 (Bing Image Creator)

Running open-source large language models on a personal computer can be quite tricky: it involves dealing with lots of technical settings, managing environments, and finding a lot of storage space. Ollama makes things much easier. It does most of the hard work for us, bundling everything we need so we can run these big language models on a PC without the hassle. This simplifies the setup and helps the computer use its resources efficiently. Thanks to Ollama, running open-source large language models such as Llama 2 is now a breeze.
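To give a sense of the simplicity described above, here is a minimal sketch of the quickstart on Linux (macOS users download the installer from ollama.com instead). It assumes you want the Llama 2 model; the first run downloads the model weights automatically.

```shell
# Install Ollama on Linux via the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Pull Llama 2 and start an interactive chat in one command;
# the model is downloaded automatically on first run
ollama run llama2
```

That second command is the whole workflow: no manual weight downloads, no environment wrangling.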

In this article, I'll guide you through the process of running open-source large language models on your PC using the Ollama package. Please note that currently, Ollama is compatible with macOS and Linux systems, but there is a plan for…
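Beyond the command line, Ollama also serves a local REST API (by default on port 11434) that programs can call. As a minimal sketch, the snippet below builds the JSON body for its `/api/generate` endpoint; the model name `llama2` and the example prompt are illustrative, and actually sending the request assumes the Ollama server is running locally.

```python
import json

# Ollama's local REST endpoint (default port; assumes `ollama serve`
# or the desktop app is running in the background)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> str:
    """Build the JSON body for Ollama's /api/generate endpoint.

    `stream: False` asks the server to return one complete JSON
    response instead of a stream of partial tokens.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

# Example payload for a Llama 2 prompt
body = build_generate_request("llama2", "Why is the sky blue?")
print(body)
```

Sending this body as an HTTP POST to `OLLAMA_URL` (with `requests` or `urllib`) returns a JSON object whose `response` field carries the generated text.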


Kamal Dhungana

Data scientist with a passion for AI. Regularly blogging about LLMs and OpenAI's innovations, sharing insights for AI community growth.