Strix LLM Documentation and Config Changes (#315)
* feat: add new keys to README
* feat: shout out Strix models in docs
* fix: mypy error
* fix: base API
* docs: update quickstart and models
* fix: uniform api_key variable naming across docs
* test: git commit hook
* nevermind, it was nothing
* docs: update default model to claude-sonnet-4.6 and improve Strix Router docs
  - Replace gpt-5 and opus-4.6 defaults with claude-sonnet-4.6 across all docs and code
  - Rewrite the Strix Router (models.mdx) page with clearer structure and messaging
  - Add Strix Router as a recommended option in overview.mdx and the quickstart prerequisites
  - Update stale Claude 4.5 references to 4.6 in anthropic.mdx, openrouter.mdx, and bug_report.md
  - Fix install.sh links to point to models.strix.ai and correct docs URLs
  - Update error message examples in main.py to use claude-sonnet-4-6

Co-authored-by: 0xallam <ahmed39652003@gmail.com>
docs/llm-providers/models.mdx (new file, +80 lines)
---
title: "Strix Router"
description: "Access top LLMs through a single API with high rate limits and zero data retention"
---

Strix Router gives you access to the best LLMs through a single API key.

<Note>
Strix Router is currently in **beta**. It's completely optional — Strix works with any [LiteLLM-compatible provider](/llm-providers/overview) using your own API keys, or with [local models](/llm-providers/local). Strix Router is just the setup we test and optimize for.
</Note>

## Why Use Strix Router?

- **High rate limits** — No throttling during long-running scans
- **Zero data retention** — Routes only to providers with zero-data-retention policies enabled
- **Failover & load balancing** — Automatic fallback across providers for reliability
- **Simple setup** — One API key, one environment variable, no provider accounts needed
- **No markup** — Same token pricing as the underlying providers, no extra fees
- **$10 free credit** — Try it free on signup, no credit card required

## Quick Start

1. Get your API key at [models.strix.ai](https://models.strix.ai)
2. Set your environment variables:

```bash
export LLM_API_KEY='your-strix-api-key'
export STRIX_LLM='strix/claude-sonnet-4.6'
```

3. Run a scan:

```bash
strix --target ./your-app
```
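If you'd rather not export the variables into your whole shell session, POSIX shells also let you scope them to a single command. A sketch, with placeholder values — here `sh -c 'echo …'` stands in for the real `strix` invocation:

```bash
# Variables set in a command's prefix exist only for that one command.
# In practice the command would be: strix --target ./your-app
LLM_API_KEY='your-strix-api-key' STRIX_LLM='strix/claude-sonnet-4.6' \
  sh -c 'echo "model: $STRIX_LLM"'
# → model: strix/claude-sonnet-4.6

echo "${STRIX_LLM:-unset}"   # prints "unset" unless already exported in your shell
```

This keeps credentials out of the parent shell's environment once the scan finishes.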

## Available Models

### Anthropic

| Model | ID |
|-------|-----|
| Claude Sonnet 4.6 | `strix/claude-sonnet-4.6` |
| Claude Opus 4.6 | `strix/claude-opus-4.6` |

### OpenAI

| Model | ID |
|-------|-----|
| GPT-5.2 | `strix/gpt-5.2` |
| GPT-5.1 | `strix/gpt-5.1` |
| GPT-5 | `strix/gpt-5` |
| GPT-5.2 Codex | `strix/gpt-5.2-codex` |
| GPT-5.1 Codex Max | `strix/gpt-5.1-codex-max` |
| GPT-5.1 Codex | `strix/gpt-5.1-codex` |
| GPT-5 Codex | `strix/gpt-5-codex` |

### Google

| Model | ID |
|-------|-----|
| Gemini 3 Pro | `strix/gemini-3-pro-preview` |
| Gemini 3 Flash | `strix/gemini-3-flash-preview` |

### Other

| Model | ID |
|-------|-----|
| GLM-5 | `strix/glm-5` |
| GLM-4.7 | `strix/glm-4.7` |
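Every ID above plugs into the same pair of environment variables, so switching providers is a one-line change. A sketch, with a placeholder key value and the `strix` invocations commented out:

```bash
# One key covers every model; only STRIX_LLM changes between runs.
export LLM_API_KEY='your-strix-api-key'

export STRIX_LLM='strix/gpt-5.2'               # OpenAI via the router
# strix --target ./your-app

export STRIX_LLM='strix/gemini-3-pro-preview'  # Google, same key, no new account
# strix --target ./your-app

echo "$STRIX_LLM"
# → strix/gemini-3-pro-preview
```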

## Configuration Reference

<ParamField path="LLM_API_KEY" type="string" required>
Your Strix API key from [models.strix.ai](https://models.strix.ai).
</ParamField>

<ParamField path="STRIX_LLM" type="string" required>
Model ID from the tables above. Must be prefixed with `strix/`.
</ParamField>
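A tiny pre-flight check in the shell can catch a missing `strix/` prefix before a scan starts. The helper below is a hypothetical sketch, not part of the Strix CLI:

```bash
# Fail fast if STRIX_LLM is unset or lacks the required strix/ prefix.
check_strix_llm() {
  case "${STRIX_LLM:-}" in
    strix/*) echo "ok: ${STRIX_LLM}" ;;
    *)       echo "error: STRIX_LLM must start with 'strix/'" >&2; return 1 ;;
  esac
}

export STRIX_LLM='strix/claude-sonnet-4.6'
check_strix_llm
# → ok: strix/claude-sonnet-4.6
```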