Integrate DeepSeek with LibreChat Guide | Generated by AI
Here’s the fastest way to get DeepSeek working in LibreChat. I’ll show two options: use the official DeepSeek API (recommended), or go through OpenRouter.
Option A — Use DeepSeek’s official API (recommended)
- Get a DeepSeek API key
Create one in your DeepSeek account, then note the base URL: `https://api.deepseek.com` (or the OpenAI-compatible `https://api.deepseek.com/v1`). (DeepSeek API Docs, deepseek.apidog.io)
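Before wiring the key into LibreChat, it can help to confirm it works against DeepSeek's OpenAI-compatible endpoint directly. A minimal stdlib sketch — the helper name `build_chat_request` is mine, not part of any SDK, and the request shape follows the standard OpenAI chat-completions format:

```python
import json
import os
import urllib.request

# DeepSeek's OpenAI-compatible base URL, per the step above.
BASE_URL = "https://api.deepseek.com/v1"

def build_chat_request(api_key: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions request for DeepSeek."""
    payload = {
        "model": "deepseek-chat",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

if __name__ == "__main__":
    key = os.environ.get("DEEPSEEK_API_KEY")
    if key:  # only hit the network when a real key is configured
        req = build_chat_request(key, "Say hello in one word.")
        with urllib.request.urlopen(req) as resp:
            print(json.load(resp)["choices"][0]["message"]["content"])
```

If this round-trips, the same key and base URL will work in `librechat.yaml` below.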
- Add a DeepSeek endpoint in `librechat.yaml`
Put this under `endpoints:` → `custom:`:

```yaml
endpoints:
  custom:
    - name: "deepseek"
      apiKey: "${DEEPSEEK_API_KEY}"
      baseURL: "https://api.deepseek.com/v1"
      models:
        default:
          - "deepseek-chat"      # V3 (general)
          - "deepseek-coder"     # code-centric
          - "deepseek-reasoner"  # R1 reasoning
        fetch: true
      titleConvo: true
```
LibreChat ships a DeepSeek config guide that confirms these model names (`deepseek-chat`, `deepseek-coder`, `deepseek-reasoner`) and notes that R1 streams its “thought process.” (LibreChat)
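Per DeepSeek's API docs, `deepseek-reasoner` returns its reasoning separately from the final answer, in a `reasoning_content` field on the message. A sketch of splitting the two — the sample response below is illustrative, not a real API reply, and the helper is my own:

```python
# Split a deepseek-reasoner reply into its reasoning and final answer.
# The response shape mirrors DeepSeek's OpenAI-compatible JSON; the
# reasoning_content field is specific to deepseek-reasoner (R1).

def split_reasoner_reply(response: dict) -> tuple[str, str]:
    """Return (reasoning, answer) from a chat completion response dict."""
    message = response["choices"][0]["message"]
    return message.get("reasoning_content", ""), message["content"]

# Illustrative sample, shaped like a deepseek-reasoner reply:
sample = {
    "choices": [{
        "message": {
            "reasoning_content": "2 + 2: add the units digits...",
            "content": "4",
        }
    }]
}

reasoning, answer = split_reasoner_reply(sample)
print(answer)  # the final answer, without the chain-of-thought
```

LibreChat handles this display for you; the sketch just shows why R1 output looks different from `deepseek-chat` when you call the API yourself.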
- Expose the API key via `.env`
In your LibreChat `.env` file:

```
DEEPSEEK_API_KEY=sk-...
```

LibreChat supports custom OpenAI-compatible providers via `librechat.yaml` + `.env`. (LibreChat)
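LibreChat resolves `${DEEPSEEK_API_KEY}`-style placeholders in `librechat.yaml` from the environment at startup. A sketch of that substitution, to make the `.env` → YAML hand-off concrete (this mimics the behavior; it is not LibreChat's actual code):

```python
import os
import re

def resolve_placeholders(value: str) -> str:
    """Replace every ${NAME} in value with os.environ[NAME] (empty if unset)."""
    return re.sub(
        r"\$\{(\w+)\}",
        lambda m: os.environ.get(m.group(1), ""),
        value,
    )

os.environ["DEEPSEEK_API_KEY"] = "sk-example"  # stand-in for the .env entry
print(resolve_placeholders("${DEEPSEEK_API_KEY}"))  # -> sk-example
```

This is also why a missing or misspelled variable in `.env` shows up as an empty API key rather than a YAML error.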
- Restart your stack
From your LibreChat folder:

```bash
docker compose down
docker compose up -d --build
```

(Needed so the API container reloads `librechat.yaml` and `.env`.) If your custom endpoints don’t appear, check the `api` container logs (`docker compose logs api`) for config errors. (GitHub)
Option B — Use DeepSeek via OpenRouter
If you already use OpenRouter, just register the DeepSeek models in an OpenRouter endpoint block in `librechat.yaml`:

```yaml
endpoints:
  custom:
    - name: "openrouter"
      apiKey: "${OPENROUTER_KEY}"
      baseURL: "https://openrouter.ai/api/v1"
      models:
        default:
          - "deepseek/deepseek-chat"
          - "deepseek/deepseek-coder"
          - "deepseek/deepseek-reasoner"
```
Two important notes from the LibreChat docs:
• Don’t name the env var `OPENROUTER_API_KEY` (use a different name like `OPENROUTER_KEY`), or you’ll accidentally override the OpenAI endpoint.
• OpenRouter is first-class in LibreChat’s custom endpoints list. (LibreChat)
OpenRouter exposes DeepSeek models with an OpenAI-compatible surface. (OpenRouter)
Tips & gotchas
- R1 / `deepseek-reasoner`: it can stream its chain-of-thought (“thought process”), and some OpenAI parameters may not apply to it. If you see odd output, test with `deepseek-chat` first. (LibreChat)
- Model fetching: setting `fetch: true` lets LibreChat pull the latest model list when the provider supports it; otherwise the models you list explicitly are used. (LibreChat)
- Where these settings live: `librechat.yaml` is the custom endpoints file; `.env` carries keys and proxy settings. (LibreChat)
If you want, paste your current `librechat.yaml` and I’ll patch it to add DeepSeek cleanly (no tables, I promise).