Using Copilot for Obsidian with a local LLM and vector store

Will it work on my consumer-grade laptop?

PKM Explorer
6 min read · Apr 11, 2024

Since I first read about running local LLMs in an offline environment, I have been fascinated by the idea of trying this in the context of my Obsidian PKM system. There are now several Obsidian plugins that let you use a local LLM instead of a commercial LLM provider.

So I decided to see if I could get the Copilot for Obsidian plugin, created by Logan Yang, running on my Intel i5 Windows laptop with 16GB of RAM and an NVIDIA GeForce GTX 1650 graphics card, in combination with a local LLM environment.

Installing the plugin

The first step is to install and enable the Copilot plugin from the community plugins list in Obsidian's Settings. I chose Copilot because it not only lets you hold a question-and-answer dialogue with an LLM, but also lets you create an indexed version of your vault so that you can query its content.

Copilot has three modes of operation:

  • Chat — the default conversation with the installed LLM
  • Long Note QA — to ask questions about the active note in your vault
  • Vault QA (beta) — to ask questions about all information in your vault, based on an indexed version of the vault (a conceptual sketch of this indexing follows below)
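Under the hood, Vault QA-style retrieval boils down to embedding every note into a vector store and then searching that store by similarity. The snippet below is only a conceptual sketch, not the plugin's actual implementation: it assumes a small local embedding model (all-MiniLM-L6-v2 via the sentence-transformers library) and a vault folder called MyVault, both chosen purely for illustration.

```python
# Conceptual sketch of a "Vault QA" style index. This is NOT how the Copilot
# plugin is implemented; it only illustrates embedding + similarity search,
# assuming a small local embedding model and a vault folder named "MyVault".
from pathlib import Path

import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # small model, runs on CPU

# 1. Index: embed every markdown note in the vault.
vault = Path("MyVault")
notes = list(vault.rglob("*.md"))
texts = [n.read_text(encoding="utf-8") for n in notes]
embeddings = model.encode(texts, normalize_embeddings=True)

# 2. Query: embed the question and rank notes by cosine similarity.
question = "What did I write about local LLMs?"
q_emb = model.encode([question], normalize_embeddings=True)[0]
scores = embeddings @ q_emb  # cosine similarity, since vectors are normalized
top = np.argsort(scores)[::-1][:3]

# 3. Show the best-matching notes; a QA plugin would pass these to the
#    local LLM as context for answering the question.
for i in top:
    print(f"{scores[i]:.2f}  {notes[i].name}")
```

In practice the plugin handles all of this for you when you build the vault index; the sketch just shows why indexing has to happen before Vault QA can answer questions.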
