Solving Fabric and Local Ollama Context Issues: A Step-by-Step Guide

Marcelo Busana
2 min read · Jul 25, 2024


If you’re facing challenges running Fabric and Ollama locally, this guide is for you. Let’s dive into a common issue and how to solve it.

The Problem

When running a command like:

fabric --remoteOllamaServer 192.168.1.100 --model "llama3.1:latest" -sp extract_wisdom < video_transcript.txt

The output does not follow the structure defined by the extract_wisdom pattern.

The root cause is the model's context window. By default, Ollama loads models with a small context window (2,048 tokens at the time of writing), so a long transcript plus the pattern's system prompt simply does not fit: the excess is truncated, and the model never sees the full instructions.
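To get a rough sense of whether your input even fits, you can estimate the transcript's token count. A common rule of thumb for English text is about 1.3 tokens per word (a heuristic, not an exact count):

# Rough token estimate: ~1.3 tokens per English word
echo $(( $(wc -w < video_transcript.txt) * 13 / 10 ))

If that number is well above the context window, part of your prompt is being silently dropped.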

Adjusting the Context Window

First, you can adjust the context window parameter when running Ollama in the console. For example:

ollama run llama3.1
>>> /set parameter num_ctx 4096
Set parameter 'num_ctx' to '4096'

This adjusts the context window size, allowing you to run longer prompts. However, the setting only applies to that interactive session, and when using Fabric you cannot pass the parameter directly. So how do we overcome this limitation? Simple: create a custom model that bakes the new parameter in.
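As an aside, Ollama's own HTTP API does accept num_ctx per request through the options field; it is only the Fabric CLI that has no flag for it. A minimal sketch against a default local Ollama server (the prompt here is just a placeholder):

curl http://localhost:11434/api/generate -d '{
  "model": "llama3.1:latest",
  "prompt": "Summarize this transcript...",
  "options": { "num_ctx": 4096 }
}'

Since Fabric does not expose those options, baking the parameter into a custom model is the cleanest workaround.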

Create a Custom Model

Here’s how you can do it in a few easy steps:

1. Create a Configuration File
Using a text editor such as nano, create an Ollama Modelfile named llama3.1_ctx_4096 with the following content:

FROM llama3.1:latest 
PARAMETER num_ctx 4096
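If you prefer to stay in the shell, a heredoc creates the same file (the filename is just a convention; any name works as long as you pass it to ollama create in the next step):

cat > llama3.1_ctx_4096 <<'EOF'
FROM llama3.1:latest
PARAMETER num_ctx 4096
EOF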

2. Build the New Model
Use the Ollama create command:

ollama create llama3.1_ctx_4096 -f llama3.1_ctx_4096
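You can verify that the model was registered and that the parameter stuck:

ollama list
ollama show llama3.1_ctx_4096 --parameters

The new model should appear in the list, and the second command should print num_ctx 4096.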

3. Run Fabric with the New Model
Now point Fabric at the custom model:

fabric --remoteOllamaServer 192.168.1.100 --model "llama3.1_ctx_4096:latest" -sp extract_wisdom < video_transcript.txt

The output should now follow the pattern as expected. You can also use this custom model in any other application that connects to Ollama, simply by selecting it like any other model.

Important Considerations

Increasing the context window lets the model consider more of the input at once, improving coherence on longer texts. However, it also requires more memory and compute, since the model's KV cache grows with the window size. Conversely, reducing the value can speed up generation but may produce less coherent or less contextually aware output on longer texts.
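If 4,096 tokens is still too small for your transcripts, the same trick scales up. A sketch, assuming your hardware has enough memory for the larger window (Llama 3.1 itself supports context lengths far beyond this):

# Modelfile: llama3.1_ctx_8192
FROM llama3.1:latest
PARAMETER num_ctx 8192

ollama create llama3.1_ctx_8192 -f llama3.1_ctx_8192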

Conclusion

I hope this guide was helpful! If you have any questions or need further assistance, drop your comments below. Don’t forget to like and share. See you in the next post!
