
Setting Up Codex CLI Configuration for OpenRouter

Codex CLI (from OpenAI) is a terminal-based coding agent that supports custom model providers, such as OpenRouter, via its configuration file. OpenRouter exposes an OpenAI-compatible API, so you can route requests through it by defining a provider with the appropriate base URL. This gives you access to a wider range of models (e.g., from Anthropic, Google, or open-source model providers) while keeping Codex's agent workflow.

The configuration is stored in a TOML file at ~/.codex/config.toml (create it if it doesn’t exist). You’ll define a model provider section for OpenRouter and then reference it in a profile for specific models.
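
If the directory or file doesn't exist yet, create them first:

mkdir -p ~/.codex
touch ~/.codex/config.toml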

Step 1: Get Your OpenRouter API Key
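
Sign in at openrouter.ai and create an API key from the Keys page of your account. Then export it in your shell so Codex can read it; the value below is a placeholder for your real key:

export OPENROUTER_API_KEY="sk-or-..."

Add the export to your shell profile (e.g., ~/.bashrc or ~/.zshrc) if you want it to persist across sessions. The variable name must match the env_key you'll set in the config below.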

Step 2: Edit the Config File

Open ~/.codex/config.toml in your editor and add the following sections. This sets the base URL to OpenRouter’s endpoint (https://openrouter.ai/api/v1), which is OpenAI-compatible (Codex appends /chat/completions automatically).

# Define the OpenRouter provider
[model_providers.openrouter]
name = "OpenRouter"
base_url = "https://openrouter.ai/api/v1"
env_key = "OPENROUTER_API_KEY"  # Reads from your env var for auth

# Define a profile using this provider (example: using a GPT-like model)
[profiles.openrouter-gpt]
model_provider = "openrouter"
model = "openai/gpt-4o-mini"  # Replace with any OpenRouter model ID, e.g., "anthropic/claude-3.5-sonnet"

Step 3: Use the Configuration
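
With the key exported and the config saved, start Codex and point it at the profile; the --profile flag selects a profile defined in config.toml:

codex --profile openrouter-gpt

You can also override the model for a single run with the --model flag, using any OpenRouter model ID.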

Troubleshooting
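
A few common failure modes to check first (general pointers, not an exhaustive list):

- 401/authentication errors usually mean OPENROUTER_API_KEY isn't exported in the shell that launched Codex; verify with echo $OPENROUTER_API_KEY.
- "Model not found" errors usually mean a typo in the model ID; copy the exact slug from openrouter.ai/models.
- If requests fail to connect, re-check base_url: it should be exactly https://openrouter.ai/api/v1 with no trailing path, since Codex appends the /chat/completions suffix itself.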

This setup keeps requests in the OpenAI-compatible format Codex already speaks while giving you access to OpenRouter's model ecosystem. For more details, see the Codex CLI docs on npm or OpenRouter's API guide. If you're using a fork such as Open Codex, the configuration is similar, though forks may add extra provider support (e.g., native Gemini).

