List of Different Ways to Run LLMs Locally

Dr. Walid Soula
ILLUMINATION
6 min read · Mar 26, 2024


In this article, we will look at different ways to run any LLM locally. Pin this article so you can test everything, or come back to it when needed.

If you find my articles interesting, don’t forget to clap and follow 👍🏼, these articles take time and effort to write!

From the list below, I mainly use LMStudio, Ollama, Jan.ai, and Hugging Face. A special mention goes to Pinokio, a very good platform!

1/ LMStudio

LM Studio is a desktop application for running LLMs locally on your computer. Link: https://lmstudio.ai/
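Beyond the chat interface, LM Studio can expose the loaded model through an OpenAI-compatible local server (by default on port 1234). Here is a minimal sketch of talking to it from Python; the endpoint path and the placeholder model name are assumptions, and the server must be started from the app first:

```python
import json
import urllib.request

# Assumed default address of LM Studio's local server
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_request(prompt: str) -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": "local-model",  # placeholder; LM Studio answers with whichever model is loaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask_lmstudio(prompt: str) -> str:
    """POST the request to the running LM Studio server and return the reply text."""
    req = urllib.request.Request(
        LMSTUDIO_URL,
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

With the server running, `ask_lmstudio("Why run LLMs locally?")` returns the model's answer as a string.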

2/ Ollama

Ollama is a tool that allows you to run open-source large language models (LLMs) locally on your machine. It supports a variety of models, including Llama 2, Code Llama, and others. It bundles model weights, configuration, and data into a single package, defined by a Modelfile. Link: https://ollama.com/
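To illustrate the Modelfile mentioned above, here is a hypothetical example that packages Llama 2 with a sampling parameter and a system prompt (the exact values are illustrative):

```
# Hypothetical Modelfile: bundle Llama 2 with custom settings
FROM llama2
PARAMETER temperature 0.7
SYSTEM "You are a concise assistant that answers in plain English."
```

You would build and run it with `ollama create my-assistant -f Modelfile` followed by `ollama run my-assistant`.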

3/ Hugging Face and Transformers

Hugging Face is the Docker Hub equivalent for Machine Learning and AI, offering a vast array of open-source models. Hugging Face also provides Transformers, a Python library that streamlines running an LLM locally. Here is an example of how to run Phi 2 from Microsoft:

import torch
from…
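The snippet above is truncated; a minimal, self-contained sketch (assuming the standard `transformers` Auto classes and the `microsoft/phi-2` checkpoint, which is downloaded on first use) might look like this:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "microsoft/phi-2"

def build_prompt(question: str) -> str:
    """Phi-2 responds well to a simple instruct-style prompt format."""
    return f"Instruct: {question}\nOutput:"

def run_phi2(question: str, max_new_tokens: int = 100) -> str:
    """Load Phi-2 locally and generate an answer to the question."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
    inputs = tokenizer(build_prompt(question), return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Calling `run_phi2("What is an LLM?")` downloads the weights on first use, runs generation locally, and returns the decoded text; note the ~2.7B-parameter model needs a few GB of RAM.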
