Combining Sider AI and Deepseek-R1 LLM: A Local Integration Tutorial
Hello! In this article, I'll demonstrate how to connect a locally running Deepseek-R1 model to the Sider AI browser extension using Sider's external API integration feature.
The method we use here can also be applied to other LLMs (Large Language Models). Similarly, by leveraging this approach, you can integrate your own LLMs into other tools that allow API usage.
Steps:
- Download and Install the Sider.ai Extension (We will be using Chrome)
- Install Ollama (We’ll use Ollama as it speeds up the process of launching the LLM. Alternatively, you can start the LLM using other methods as well.)
- Configure Ollama
- Configure Sider.ai
- Usage
Step 1 — Download and Install the Sider.ai Extension
Go to https://sider.ai/download and complete the download and installation process.
Once the download is complete, create a user account for Sider and log in (the extension cannot be used without logging in).
Step 2 — Install Ollama
Why are we using Ollama?
Ollama provides an OpenAI-compatible API, which allows it to work seamlessly with Sider. This enables us to quickly deploy any LLM we choose, using a standardized API.
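To illustrate what "OpenAI-compatible" means here: once Ollama is installed and running (steps below), it exposes the same /v1/chat/completions endpoint format that the OpenAI API uses, so any tool that can speak to OpenAI can speak to Ollama. A minimal sketch, assuming the default port 11434 and a model that has already been pulled:

```
# Minimal sketch: an OpenAI-style chat request against a local Ollama instance.
# Assumes Ollama is running on its default port (11434) and the model below has been pulled.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "deepseek-r1:14b",
        "messages": [{"role": "user", "content": "Hello!"}]
      }'
```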
Visit https://ollama.com/download and download Ollama.
Next, go to Ollama's model library and select the Deepseek-R1 model: https://ollama.com/library/deepseek-r1
Here, I will use the 14b model. Depending on your computer's hardware specifications, you can choose a smaller or larger variant.
To download this model, run the following command in the terminal:
ollama pull deepseek-r1:14b
Ollama will then download the model to your system.
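To confirm the download finished, and to see the exact model name (you will need it later in Sider), you can list the models available locally:

```
# Lists locally available models; the NAME column should include deepseek-r1:14b.
ollama list
```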
Step 3 — Configure Ollama
In order to access the LLM models on the machine, Ollama's API needs to be running. We start it with the ollama serve command, customized with two environment variables:
OLLAMA_HOST=0.0.0.0:5050 OLLAMA_ORIGINS=chrome-extension://* ollama serve
OLLAMA_HOST=0.0.0.0:5050
=> This tells Ollama to listen on every network interface of the machine and to use port 5050. You can change the port as needed.
OLLAMA_ORIGINS=chrome-extension://*
=> Here, we define the origin pattern that is allowed to access Ollama. Since our Chrome extension will be making the requests, this parameter is required: as shown in the picture, the extension's chrome-extension:// address is sent as the origin of each request to the Ollama API. If you don't set this, you will get 403 errors.
These are defined as environment variables for this session, and they are used when the ollama serve command is executed.
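Before moving on to Sider, it can help to confirm that the server is reachable on the custom port and that a chrome-extension origin is accepted. A rough check with curl (the extension ID below is just a placeholder; any chrome-extension:// origin should match the wildcard):

```
# Sends a request with a chrome-extension Origin header, as the Sider extension will.
# Expect an HTTP 200 and an Access-Control-Allow-Origin header in the response.
curl -i http://127.0.0.1:5050/api/tags \
  -H "Origin: chrome-extension://abcdefghijklmnop"
```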
Step 4 — Configure Sider
To configure Sider, go to the extension settings. In the General settings, select Custom API Key as the Service provider.
Then, from the dropdown, open the Deepseek section and configure the following settings:
- ApiKey => You can enter "ollama". The value won't actually be used, since the Ollama API has no authentication, but Sider does not allow you to leave this field blank.
- Api Proxy URL => http://127.0.0.1:PORT/v1. Adjust the port number according to your configuration (in this tutorial, http://127.0.0.1:5050/v1).
- Model List => Enter the exact name of the Deepseek model you are using; in this tutorial that is deepseek-r1:14b. This is a crucial step, as the API request will use this model name, and if you don't define it you will encounter an error.
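If Sider reports an error at this point, one way to debug is to replicate the kind of request it sends: an OpenAI-style chat completion against the Api Proxy URL with the configured model name. For the setup in this tutorial (port 5050, model deepseek-r1:14b) that would look roughly like this:

```
# Replicates the kind of request Sider sends through the Api Proxy URL.
# A successful JSON response here means the port and model name are correct
# (the origin check only applies to requests coming from the browser extension).
curl http://127.0.0.1:5050/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "deepseek-r1:14b",
        "messages": [{"role": "user", "content": "Say hi in one sentence."}]
      }'
```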
Step 5 — Usage
We can now select the model we set up from the chat interface and start using it.