Strix LLM Documentation and Config Changes (#315)
* feat: add to readme new keys
* feat: shoutout strix models, docs
* fix: mypy error
* fix: base api
* docs: update quickstart and models
* fixes: changes to docs uniform api_key variable naming
* test: git commit hook
* nevermind it was nothing
* docs: Update default model to claude-sonnet-4.6 and improve Strix Router docs
  - Replace gpt-5 and opus-4.6 defaults with claude-sonnet-4.6 across all docs and code
  - Rewrite Strix Router (models.mdx) page with clearer structure and messaging
  - Add Strix Router as recommended option in overview.mdx and quickstart prerequisites
  - Update stale Claude 4.5 references to 4.6 in anthropic.mdx, openrouter.mdx, bug_report.md
  - Fix install.sh links to point to models.strix.ai and correct docs URLs
  - Update error message examples in main.py to use claude-sonnet-4-6

---------

Co-authored-by: 0xallam <ahmed39652003@gmail.com>
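For a user, the headline change above (the new `claude-sonnet-4.6` default plus the Strix Router) reduces to two environment variables. A minimal sketch, where the API key value is a placeholder:

```shell
# Sketch of the new recommended setup from this PR.
# "strix/claude-sonnet-4.6" is the router model ID introduced by this change;
# the key below is a placeholder, not a real credential.
export STRIX_LLM="strix/claude-sonnet-4.6"
export LLM_API_KEY="your-strix-api-key"
echo "configured model: $STRIX_LLM"
```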
@@ -8,7 +8,7 @@ Configure Strix using environment variables or a config file.
 ## LLM Configuration

 <ParamField path="STRIX_LLM" type="string" required>
-Model name in LiteLLM format (e.g., `openai/gpt-5`, `anthropic/claude-sonnet-4-5`).
+Model name in LiteLLM format (e.g., `anthropic/claude-sonnet-4-6`, `openai/gpt-5`).
 </ParamField>

 <ParamField path="LLM_API_KEY" type="string">
@@ -86,7 +86,7 @@ strix --target ./app --config /path/to/config.json
 ```json
 {
   "env": {
-    "STRIX_LLM": "openai/gpt-5",
+    "STRIX_LLM": "anthropic/claude-sonnet-4-6",
     "LLM_API_KEY": "sk-...",
     "STRIX_REASONING_EFFORT": "high"
   }
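The JSON fragment in the hunk above can be checked locally before it is handed to `strix --config`; a hedged sketch, where the file path and key value are placeholders:

```shell
# Write the updated config shown in the hunk above and verify it parses as
# JSON. /tmp/strix-config.json and "sk-..." are placeholders.
cat > /tmp/strix-config.json <<'EOF'
{
  "env": {
    "STRIX_LLM": "anthropic/claude-sonnet-4-6",
    "LLM_API_KEY": "sk-...",
    "STRIX_REASONING_EFFORT": "high"
  }
}
EOF
python3 -m json.tool /tmp/strix-config.json > /dev/null && echo "valid JSON"
```

If it parses, the same file works as `strix --target ./app --config /tmp/strix-config.json`.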
@@ -97,7 +97,7 @@ strix --target ./app --config /path/to/config.json

 ```bash
 # Required
-export STRIX_LLM="openai/gpt-5"
+export STRIX_LLM="anthropic/claude-sonnet-4-6"
 export LLM_API_KEY="sk-..."

 # Optional: Enable web search
@@ -32,7 +32,7 @@ description: "Contribute to Strix development"
 </Step>
 <Step title="Configure LLM">
 ```bash
-export STRIX_LLM="openai/gpt-5"
+export STRIX_LLM="anthropic/claude-sonnet-4-6"
 export LLM_API_KEY="your-api-key"
 ```
 </Step>
@@ -78,7 +78,7 @@ Strix uses a graph of specialized agents for comprehensive security testing:
 curl -sSL https://strix.ai/install | bash

 # Configure
-export STRIX_LLM="openai/gpt-5"
+export STRIX_LLM="anthropic/claude-sonnet-4-6"
 export LLM_API_KEY="your-api-key"

 # Scan
@@ -35,7 +35,7 @@ Add these secrets to your repository:

 | Secret | Description |
 |--------|-------------|
-| `STRIX_LLM` | Model name (e.g., `openai/gpt-5`) |
+| `STRIX_LLM` | Model name (e.g., `anthropic/claude-sonnet-4-6`) |
 | `LLM_API_KEY` | API key for your LLM provider |

 ## Exit Codes
@@ -6,7 +6,7 @@ description: "Configure Strix with Claude models"
 ## Setup

 ```bash
-export STRIX_LLM="anthropic/claude-sonnet-4-5"
+export STRIX_LLM="anthropic/claude-sonnet-4-6"
 export LLM_API_KEY="sk-ant-..."
 ```

@@ -14,8 +14,8 @@ export LLM_API_KEY="sk-ant-..."

 | Model | Description |
 |-------|-------------|
-| `anthropic/claude-sonnet-4-5` | Best balance of intelligence and speed (recommended) |
-| `anthropic/claude-opus-4-5` | Maximum capability for deep analysis |
+| `anthropic/claude-sonnet-4-6` | Best balance of intelligence and speed (recommended) |
+| `anthropic/claude-opus-4-6` | Maximum capability for deep analysis |

 ## Get API Key
docs/llm-providers/models.mdx (new file, 80 lines)
@@ -0,0 +1,80 @@
+---
+title: "Strix Router"
+description: "Access top LLMs through a single API with high rate limits and zero data retention"
+---
+
+Strix Router gives you access to the best LLMs through a single API key.
+
+<Note>
+Strix Router is currently in **beta**. It's completely optional — Strix works with any [LiteLLM-compatible provider](/llm-providers/overview) using your own API keys, or with [local models](/llm-providers/local). Strix Router is just the setup we test and optimize for.
+</Note>
+
+## Why Use Strix Router?
+
+- **High rate limits** — No throttling during long-running scans
+- **Zero data retention** — Routes to providers with zero data retention policies enabled
+- **Failover & load balancing** — Automatic fallback across providers for reliability
+- **Simple setup** — One API key, one environment variable, no provider accounts needed
+- **No markup** — Same token pricing as the underlying providers, no extra fees
+- **$10 free credit** — Try it free on signup, no credit card required
+
+## Quick Start
+
+1. Get your API key at [models.strix.ai](https://models.strix.ai)
+2. Set your environment:
+
+```bash
+export LLM_API_KEY='your-strix-api-key'
+export STRIX_LLM='strix/claude-sonnet-4.6'
+```
+
+3. Run a scan:
+
+```bash
+strix --target ./your-app
+```
+
+## Available Models
+
+### Anthropic
+
+| Model | ID |
+|-------|-----|
+| Claude Sonnet 4.6 | `strix/claude-sonnet-4.6` |
+| Claude Opus 4.6 | `strix/claude-opus-4.6` |
+
+### OpenAI
+
+| Model | ID |
+|-------|-----|
+| GPT-5.2 | `strix/gpt-5.2` |
+| GPT-5.1 | `strix/gpt-5.1` |
+| GPT-5 | `strix/gpt-5` |
+| GPT-5.2 Codex | `strix/gpt-5.2-codex` |
+| GPT-5.1 Codex Max | `strix/gpt-5.1-codex-max` |
+| GPT-5.1 Codex | `strix/gpt-5.1-codex` |
+| GPT-5 Codex | `strix/gpt-5-codex` |
+
+### Google
+
+| Model | ID |
+|-------|-----|
+| Gemini 3 Pro | `strix/gemini-3-pro-preview` |
+| Gemini 3 Flash | `strix/gemini-3-flash-preview` |
+
+### Other
+
+| Model | ID |
+|-------|-----|
+| GLM-5 | `strix/glm-5` |
+| GLM-4.7 | `strix/glm-4.7` |
+
+## Configuration Reference
+
+<ParamField path="LLM_API_KEY" type="string" required>
+Your Strix API key from [models.strix.ai](https://models.strix.ai).
+</ParamField>
+
+<ParamField path="STRIX_LLM" type="string" required>
+Model ID from the tables above. Must be prefixed with `strix/`.
+</ParamField>
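The `STRIX_LLM` ParamField in the new page above says router model IDs must carry the `strix/` prefix; a quick shell sketch of that rule (the check itself is illustrative, not actual Strix code):

```shell
# Illustrative check of the "must be prefixed with strix/" rule from the
# Configuration Reference above; not part of the Strix codebase.
model="strix/claude-sonnet-4.6"
case "$model" in
  strix/*) echo "router model: $model" ;;
  *)       echo "bring-your-own-key model: $model" ;;
esac
```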
@@ -19,7 +19,7 @@ Access any model on OpenRouter using the format `openrouter/<provider>/<model>`:
 | Model | Configuration |
 |-------|---------------|
 | GPT-5 | `openrouter/openai/gpt-5` |
-| Claude 4.5 Sonnet | `openrouter/anthropic/claude-sonnet-4.5` |
+| Claude Sonnet 4.6 | `openrouter/anthropic/claude-sonnet-4.6` |
 | Gemini 3 Pro | `openrouter/google/gemini-3-pro-preview` |
 | GLM-4.7 | `openrouter/z-ai/glm-4.7` |

@@ -5,31 +5,54 @@ description: "Configure your AI model for Strix"

 Strix uses [LiteLLM](https://docs.litellm.ai/docs/providers) for model compatibility, supporting 100+ LLM providers.

-## Recommended Models
+## Strix Router (Recommended)

-For best results, use one of these models:
+The fastest way to get started. [Strix Router](/llm-providers/models) gives you access to tested models with the highest rate limits and zero data retention.
+
+```bash
+export STRIX_LLM="strix/claude-sonnet-4.6"
+export LLM_API_KEY="your-strix-api-key"
+```
+
+Get your API key at [models.strix.ai](https://models.strix.ai).
+
+## Bring Your Own Key
+
+You can also use any LiteLLM-compatible provider with your own API keys:

 | Model | Provider | Configuration |
 | ----------------- | ------------- | -------------------------------- |
+| Claude Sonnet 4.6 | Anthropic | `anthropic/claude-sonnet-4-6` |
 | GPT-5 | OpenAI | `openai/gpt-5` |
-| Claude 4.5 Sonnet | Anthropic | `anthropic/claude-sonnet-4-5` |
 | Gemini 3 Pro | Google Vertex | `vertex_ai/gemini-3-pro-preview` |

 ## Quick Setup

 ```bash
-export STRIX_LLM="openai/gpt-5"
+export STRIX_LLM="anthropic/claude-sonnet-4-6"
 export LLM_API_KEY="your-api-key"
 ```

 ## Local Models

 Run models locally with [Ollama](https://ollama.com), [LM Studio](https://lmstudio.ai), or any OpenAI-compatible server:

 ```bash
 export STRIX_LLM="ollama/llama4"
 export LLM_API_BASE="http://localhost:11434"
 ```

 See the [Local Models guide](/llm-providers/local) for setup instructions and recommended models.

 ## Provider Guides

 <CardGroup cols={2}>
+  <Card title="Strix Router" href="/llm-providers/models">
+    Recommended models router with high rate limits.
+  </Card>
   <Card title="OpenAI" href="/llm-providers/openai">
     GPT-5 and Codex models.
   </Card>
   <Card title="Anthropic" href="/llm-providers/anthropic">
-    Claude 4.5 Sonnet, Opus, and Haiku.
+    Claude Sonnet 4.6, Opus, and Haiku.
   </Card>
   <Card title="OpenRouter" href="/llm-providers/openrouter">
     Access 100+ models through a single API.
@@ -38,7 +61,7 @@ export LLM_API_KEY="your-api-key"
     Gemini 3 models via Google Cloud.
   </Card>
   <Card title="AWS Bedrock" href="/llm-providers/bedrock">
-    Claude 4.5 and Titan models via AWS.
+    Claude and Titan models via AWS.
   </Card>
   <Card title="Azure OpenAI" href="/llm-providers/azure">
     GPT-5 via Azure.
@@ -53,8 +76,8 @@ export LLM_API_KEY="your-api-key"
 Use LiteLLM's `provider/model-name` format:

 ```
+anthropic/claude-sonnet-4-6
 openai/gpt-5
-anthropic/claude-sonnet-4-5
 vertex_ai/gemini-3-pro-preview
 bedrock/anthropic.claude-4-5-sonnet-20251022-v1:0
 ollama/llama4
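The `provider/model-name` strings above split cleanly on the first slash; a small sketch using POSIX parameter expansion, with a model string taken from the list above:

```shell
# Split a LiteLLM-style model string into provider and model name.
model="anthropic/claude-sonnet-4-6"
provider="${model%%/*}"   # text before the first slash
name="${model#*/}"        # text after the first slash
echo "provider=$provider name=$name"
```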
@@ -6,7 +6,7 @@ description: "Install Strix and run your first security scan"
 ## Prerequisites

 - Docker (running)
-- An LLM provider API key (OpenAI, Anthropic, or local model)
+- An LLM API key — use [Strix Router](/llm-providers/models) for the easiest setup, or bring your own key from any [supported provider](/llm-providers/overview)

 ## Installation

@@ -27,13 +27,23 @@ description: "Install Strix and run your first security scan"

 Set your LLM provider:

-```bash
-export STRIX_LLM="openai/gpt-5"
-export LLM_API_KEY="your-api-key"
-```
+<Tabs>
+  <Tab title="Strix Router">
+    ```bash
+    export STRIX_LLM="strix/claude-sonnet-4.6"
+    export LLM_API_KEY="your-strix-api-key"
+    ```
+  </Tab>
+  <Tab title="Bring Your Own Key">
+    ```bash
+    export STRIX_LLM="anthropic/claude-sonnet-4-6"
+    export LLM_API_KEY="your-api-key"
+    ```
+  </Tab>
+</Tabs>

 <Tip>
-For best results, use `openai/gpt-5`, `anthropic/claude-sonnet-4-5`, or `vertex_ai/gemini-3-pro-preview`.
+For best results, use `strix/claude-sonnet-4.6`, `strix/claude-opus-4.6`, or `strix/gpt-5.2`.
 </Tip>

 ## Run Your First Scan
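Taken together, the quickstart changes in this PR amount to the following flow; a hedged end-to-end sketch (the `strix` invocation is commented out because it needs Docker and a real key):

```shell
# End-to-end sketch of the updated quickstart under the new defaults.
# The key is a placeholder; uncomment the scan line in a real environment.
export STRIX_LLM="strix/claude-sonnet-4.6"
export LLM_API_KEY="your-strix-api-key"
# strix --target ./your-app
echo "ready to scan with $STRIX_LLM"
```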