Bringing AI Home: Your Private LLM Chat Alternative to ChatGPT

Mohammed Hany Shokry
4 min read · Aug 24, 2024


As a college student, lead engineer, and tech enthusiast, I rely heavily on ChatGPT across many facets of my life: studying, writing essays, completing assignments, scripting, preparing presentations, coding, creating documentation, learning new languages, and building hobby projects.

Though I use GPT-3.5 and Sonnet for most tasks, there are times when I need a more advanced language model. Despite subscribing to ChatGPT Plus, I noticed I wasn't fully using its additional features; I mainly use it for chat. Additionally, my team works on several AI projects, some of which use large language models (LLMs), and staying up to date with new models and evaluating their capabilities as soon as they're available is crucial for our work. Consequently, I began searching for a chat interface that lets you specify a custom API for the LLM, so we can use a variety of models, open-source ones included, through their APIs.

I stumbled upon LibreChat, and in this article I'll walk you through setting it up with MongoDB for chat history, the OpenAI API, and Together AI. This guide covers a very minimal setup.

Cloning LibreChat

First, clone the LibreChat repository:

git clone https://github.com/danny-avila/LibreChat

I prefer an upstream/origin setup to keep my local clone up to date with the primary LibreChat repository; a sketch of how I wire up the remotes follows the commands below.

To pull the latest updates from LibreChat and push them to your repo, run:

  • git pull upstream main
  • git push
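
As a rough sketch of that remote setup (assuming you have forked the repository to your own GitHub account; <your-username> below is a placeholder):

# Point origin at your fork and keep the main LibreChat repo as upstream
git remote rename origin upstream
git remote add origin https://github.com/<your-username>/LibreChat
git push -u origin main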

Chat History Database (MongoDB)

For chat history, I use a free-tier cluster on MongoDB Atlas.

  1. Go to MongoDB Atlas, create an account, and set up a cluster. Copy the connection string; an example of its shape is shown after this list.
  2. For network access, either add your IP address and update it whenever it changes, or allow access from anywhere.
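
For reference, the Atlas connection string looks roughly like the line below; it ends up as the MONGO_URI value in the .env file later on (the user, password, cluster host, and database name here are placeholders):

# Example shape of the Atlas connection string used as MONGO_URI
MONGO_URI=mongodb+srv://<db-user>:<db-password>@<cluster-host>.mongodb.net/LibreChat?retryWrites=true&w=majority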

For more details, see MongoDB Atlas Setup.

Uploads

Create folders named uploads and images in your base directory to save uploaded files. Ensure you add these to your .gitignore.
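
Something along these lines works (I also create logs here, since the Docker Compose file further down bind-mounts it next to the other two):

# Create the bind-mounted folders and keep them out of version control
mkdir -p uploads images logs
printf "uploads/\nimages/\nlogs/\n" >> .gitignore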

librechat.yaml

The librechat.yaml file allows the integration of custom AI endpoints, enabling connections with any AI provider that is compliant with OpenAI API standards.

Below is my configuration:

version: 1.1.5
endpoints:
  custom:
    - name: "together.ai"
      apiKey: "${TOGETHERAI_API_KEY}"
      baseURL: "https://api.together.xyz"
      models:
        default: [
          "meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo",
        ]
        fetch: false
      titleConvo: true
      titleModel: "togethercomputer/llama-2-7b-chat"
      summarize: false
      summaryModel: "togethercomputer/llama-2-7b-chat"
      forcePrompt: false
      modelDisplayLabel: "together.ai"

For further information, refer to librechat.yaml Configuration Guide.

Docker Compose

Create a docker-compose.yml file. LibreChat provides a docker-compose.override.yml.example file that you can use as a reference.

Here is my Docker Compose configuration:

version: "3.4"
services:
api:
container_name: LibreChat
ports:
- "${PORT}:${PORT}"
image: ghcr.io/danny-avila/librechat-dev:latest
restart: always
user: "${UID}:${GID}"
extra_hosts:
- "host.docker.internal:host-gateway"
environment:
- HOST=0.0.0.0
volumes:
- type: bind
source: ./.env
target: /app/.env
- ./images:/app/client/public/images
- ./logs:/app/api/logs
- ./uploads:/app/uploads
- ./librechat.yaml:/app/librechat.yaml

For additional details, refer to Docker Override Configuration.

OpenAI

  • Set up billing; I suggest turning off Auto Recharge.
  • Add usage limits per organization or project ($5 is more than enough for me).
  • Create a service account and copy its token; you can verify it with the quick check below.
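
Before wiring the key into LibreChat, I like to sanity-check it with a direct API call; this simply lists the models the key can access and should return JSON rather than an authentication error:

# Quick sanity check of the OpenAI key
curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"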

For more instructions, see Pre-Configured AI Setup.

Together.AI

  • Create an account.
  • You'll get $5 of free credit to start.
  • If you upgrade to a premium account, make sure you set a spending limit. A quick way to verify the key is shown below.
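
Together AI exposes an OpenAI-compatible API, which is why the plain baseURL works in librechat.yaml above. As a quick check of the key (assuming Together's /v1/chat/completions route; the model is the one from the config):

# Minimal chat completion against Together AI's OpenAI-compatible endpoint
curl https://api.together.xyz/v1/chat/completions \
  -H "Authorization: Bearer $TOGETHERAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'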

Environment File

Create a .env file. LibreChat provides a .env.example file that lists all possible options. Make sure the .env file is added to .gitignore.

Here is my environment configuration:

# Environment configuration for LibreChat
HOST=localhost
PORT=8795
MONGO_URI=MONGO_DB_URL
DOMAIN_CLIENT=http://localhost:3080
DOMAIN_SERVER=http://localhost:3080
NO_INDEX=true
DEBUG_LOGGING=true
DEBUG_CONSOLE=false
UID=1000
GID=1000
CONFIG_PATH="librechat.yaml"
ENDPOINTS=openAI
TOGETHERAI_API_KEY=MY_TOGETHERAI_API_KEY
OPENAI_API_KEY=MY_OPEN_AI_KEY
OPENAI_MODELS=gpt-4o
DEBUG_OPENAI=false
CREDS_KEY=f34be427ebb29de8d88c107a71546019685ed8b241d8f2ed00c3df97ad2566f0
CREDS_IV=e2341419ec3dd3d19b13a1a87fafcbfb
SEARCH=false
OPENAI_MODERATION=false
BAN_VIOLATIONS=false
BAN_DURATION=7200000
BAN_INTERVAL=20
LOGIN_VIOLATION_SCORE=1
REGISTRATION_VIOLATION_SCORE=1
CONCURRENT_VIOLATION_SCORE=1
MESSAGE_VIOLATION_SCORE=1
NON_BROWSER_VIOLATION_SCORE=20
LOGIN_MAX=7
LOGIN_WINDOW=5
REGISTER_MAX=5
REGISTER_WINDOW=60
LIMIT_CONCURRENT_MESSAGES=true
CONCURRENT_MESSAGE_MAX=2
LIMIT_MESSAGE_IP=true
MESSAGE_IP_MAX=40
MESSAGE_IP_WINDOW=1
LIMIT_MESSAGE_USER=false
MESSAGE_USER_MAX=40
MESSAGE_USER_WINDOW=1
ILLEGAL_MODEL_REQ_SCORE=5
CHECK_BALANCE=false
ALLOW_EMAIL_LOGIN=true
ALLOW_REGISTRATION=true
ALLOW_SOCIAL_LOGIN=false
ALLOW_SOCIAL_REGISTRATION=false
SESSION_EXPIRY=900000
REFRESH_TOKEN_EXPIRY=604800000
JWT_SECRET=16f8c0ef4a5d391b26034086c628469d3f9f497f08163ab9b40137092f2909ef
JWT_REFRESH_SECRET=eaa5191f2914e30b9387fd84e254e4ba6fc51b4654968a9b0803b456a54b8418
APP_TITLE=LibreChat
HELP_AND_FAQ_URL=https://librechat.ai
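
Rather than reusing the secrets shown above, you can generate your own; the lengths below match the values in the file (32 random bytes in hex for CREDS_KEY and the JWT secrets, 16 for CREDS_IV):

# Generate fresh secrets instead of copying example values
openssl rand -hex 32   # CREDS_KEY
openssl rand -hex 16   # CREDS_IV
openssl rand -hex 32   # JWT_SECRET
openssl rand -hex 32   # JWT_REFRESH_SECRET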

For more details, check the .env Configuration Guide.

Starting Up

Run the following command to start your setup:

docker compose up

Navigate to http://localhost:8795/ to access LibreChat.
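
If you'd rather not keep a terminal attached, the usual Compose flags work here as well:

# Run in the background and follow the api container's logs when needed
docker compose up -d
docker compose logs -f api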

Hosting

So far, I haven’t needed to host LibreChat on a remote server; I simply spin it up in Docker whenever needed. For hosting options, see the Hosting Guide.

Customization

LibreChat offers extensive customization options, configurations, and settings. Make sure to explore the official documentation to fully leverage its capabilities.
