---
title: "OpenRouter"
description: "Configure Strix with models via OpenRouter"
---

[OpenRouter](https://openrouter.ai) provides access to 100+ models from multiple providers through a single API.

## Setup

```bash
export STRIX_LLM="openrouter/openai/gpt-5.4"
export LLM_API_KEY="sk-or-..."
```

## Available Models

Access any model on OpenRouter using the format `openrouter/<provider>/<model>`:

| Model | Configuration |
|-------|---------------|
| GPT-5.4 | `openrouter/openai/gpt-5.4` |
| Claude Sonnet 4.6 | `openrouter/anthropic/claude-sonnet-4.6` |
| Gemini 3 Pro | `openrouter/google/gemini-3-pro-preview` |
| GLM-4.7 | `openrouter/z-ai/glm-4.7` |

## Get API Key

1. Go to [openrouter.ai](https://openrouter.ai)
2. Sign in and navigate to Keys
3. Create a new API key

## Benefits

- **Single API** — Access models from OpenAI, Anthropic, Google, Meta, and more
- **Fallback routing** — Automatic failover between providers
- **Cost tracking** — Monitor usage across all models
- **Higher rate limits** — OpenRouter handles provider limits for you
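The `openrouter/<provider>/<model>` naming convention can be sketched as a small validation helper. This is an illustrative, hypothetical function (not part of Strix or OpenRouter) that splits a `STRIX_LLM` value into its parts:

```python
def parse_openrouter_model(model_id: str) -> tuple[str, str]:
    """Split a model string like 'openrouter/openai/gpt-5.4' into
    (provider, model), validating the 'openrouter/' prefix."""
    prefix, _, rest = model_id.partition("/")
    provider, sep, model = rest.partition("/")
    if prefix != "openrouter" or not sep or not provider or not model:
        raise ValueError(
            f"expected openrouter/<provider>/<model>, got {model_id!r}"
        )
    return provider, model

# Every entry in the table above follows the same shape:
print(parse_openrouter_model("openrouter/anthropic/claude-sonnet-4.6"))
# → ('anthropic', 'claude-sonnet-4.6')
```

A check like this can catch a common misconfiguration early: passing a bare provider-native ID (e.g. `openai/gpt-5.4`) without the `openrouter/` prefix.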