diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index e272e0b..d6b9a64 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -30,7 +30,7 @@ Thank you for your interest in contributing to Strix! This guide will help you g
3. **Configure your LLM provider**
```bash
- export STRIX_LLM="anthropic/claude-sonnet-4-6"
+ export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="your-api-key"
```
diff --git a/README.md b/README.md
index fd79f86..e13edc0 100644
--- a/README.md
+++ b/README.md
@@ -86,7 +86,7 @@ curl -sSL https://strix.ai/install | bash
pipx install strix-agent
# Configure your AI provider
-export STRIX_LLM="anthropic/claude-sonnet-4-6" # or "strix/claude-sonnet-4.6" via Strix Router (https://models.strix.ai)
+export STRIX_LLM="openai/gpt-5" # or "strix/gpt-5" via Strix Router (https://models.strix.ai)
export LLM_API_KEY="your-api-key"
# Run your first security assessment
@@ -203,7 +203,7 @@ jobs:
### Configuration
```bash
-export STRIX_LLM="anthropic/claude-sonnet-4-6"
+export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="your-api-key"
# Optional
@@ -217,8 +217,8 @@ export STRIX_REASONING_EFFORT="high" # control thinking effort (default: high,
**Recommended models for best results:**
-- [Anthropic Claude Sonnet 4.6](https://claude.com/platform/api) — `anthropic/claude-sonnet-4-6`
- [OpenAI GPT-5](https://openai.com/api/) — `openai/gpt-5`
+- [Anthropic Claude Sonnet 4.6](https://claude.com/platform/api) — `anthropic/claude-sonnet-4-6`
- [Google Gemini 3 Pro Preview](https://cloud.google.com/vertex-ai) — `vertex_ai/gemini-3-pro-preview`
See the [LLM Providers documentation](https://docs.strix.ai/llm-providers/overview) for all supported providers including Vertex AI, Bedrock, Azure, and local models.
diff --git a/docs/advanced/configuration.mdx b/docs/advanced/configuration.mdx
index bd71c68..d82f822 100644
--- a/docs/advanced/configuration.mdx
+++ b/docs/advanced/configuration.mdx
@@ -8,7 +8,7 @@ Configure Strix using environment variables or a config file.
## LLM Configuration
- Model name in LiteLLM format (e.g., `anthropic/claude-sonnet-4-6`, `openai/gpt-5`).
+ Model name in LiteLLM format (e.g., `openai/gpt-5`, `anthropic/claude-sonnet-4-6`).
@@ -86,7 +86,7 @@ strix --target ./app --config /path/to/config.json
```json
{
"env": {
- "STRIX_LLM": "anthropic/claude-sonnet-4-6",
+ "STRIX_LLM": "openai/gpt-5",
"LLM_API_KEY": "sk-...",
"STRIX_REASONING_EFFORT": "high"
}
@@ -97,7 +97,7 @@ strix --target ./app --config /path/to/config.json
```bash
# Required
-export STRIX_LLM="anthropic/claude-sonnet-4-6"
+export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="sk-..."
# Optional: Enable web search
diff --git a/docs/contributing.mdx b/docs/contributing.mdx
index ffa3192..b2e50a0 100644
--- a/docs/contributing.mdx
+++ b/docs/contributing.mdx
@@ -32,7 +32,7 @@ description: "Contribute to Strix development"
```bash
- export STRIX_LLM="anthropic/claude-sonnet-4-6"
+ export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="your-api-key"
```
diff --git a/docs/index.mdx b/docs/index.mdx
index 14de192..ef5ab9a 100644
--- a/docs/index.mdx
+++ b/docs/index.mdx
@@ -78,7 +78,7 @@ Strix uses a graph of specialized agents for comprehensive security testing:
curl -sSL https://strix.ai/install | bash
# Configure
-export STRIX_LLM="anthropic/claude-sonnet-4-6"
+export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="your-api-key"
# Scan
diff --git a/docs/integrations/github-actions.mdx b/docs/integrations/github-actions.mdx
index fcc5eb9..827dce0 100644
--- a/docs/integrations/github-actions.mdx
+++ b/docs/integrations/github-actions.mdx
@@ -35,7 +35,7 @@ Add these secrets to your repository:
| Secret | Description |
|--------|-------------|
-| `STRIX_LLM` | Model name (e.g., `anthropic/claude-sonnet-4-6`) |
+| `STRIX_LLM` | Model name (e.g., `openai/gpt-5`) |
| `LLM_API_KEY` | API key for your LLM provider |
## Exit Codes
diff --git a/docs/llm-providers/anthropic.mdx b/docs/llm-providers/anthropic.mdx
index b7b3085..47a94be 100644
--- a/docs/llm-providers/anthropic.mdx
+++ b/docs/llm-providers/anthropic.mdx
@@ -6,7 +6,7 @@ description: "Configure Strix with Claude models"
## Setup
```bash
export STRIX_LLM="anthropic/claude-sonnet-4-6"
export LLM_API_KEY="sk-ant-..."
```
@@ -14,7 +14,7 @@ export LLM_API_KEY="sk-ant-..."
| Model | Description |
|-------|-------------|
-| `anthropic/claude-sonnet-4-6` | Best balance of intelligence and speed (recommended) |
+| `anthropic/claude-sonnet-4-6` | Best balance of intelligence and speed |
| `anthropic/claude-opus-4-6` | Maximum capability for deep analysis |
## Get API Key
diff --git a/docs/llm-providers/models.mdx b/docs/llm-providers/models.mdx
index 54007a9..6c26da1 100644
--- a/docs/llm-providers/models.mdx
+++ b/docs/llm-providers/models.mdx
@@ -25,7 +25,7 @@ Strix Router is currently in **beta**. It's completely optional — Strix works
```bash
export LLM_API_KEY='your-strix-api-key'
-export STRIX_LLM='strix/claude-sonnet-4.6'
+export STRIX_LLM='strix/gpt-5'
```
3. Run a scan:
diff --git a/docs/llm-providers/overview.mdx b/docs/llm-providers/overview.mdx
index 567af50..b3df76d 100644
--- a/docs/llm-providers/overview.mdx
+++ b/docs/llm-providers/overview.mdx
@@ -10,7 +10,7 @@ Strix uses [LiteLLM](https://docs.litellm.ai/docs/providers) for model compatibi
The fastest way to get started. [Strix Router](/llm-providers/models) gives you access to tested models with the highest rate limits and zero data retention.
```bash
-export STRIX_LLM="strix/claude-sonnet-4.6"
+export STRIX_LLM="strix/gpt-5"
export LLM_API_KEY="your-strix-api-key"
```
@@ -22,12 +22,12 @@ You can also use any LiteLLM-compatible provider with your own API keys:
| Model | Provider | Configuration |
| ----------------- | ------------- | -------------------------------- |
-| Claude Sonnet 4.6 | Anthropic | `anthropic/claude-sonnet-4-6` |
| GPT-5 | OpenAI | `openai/gpt-5` |
+| Claude Sonnet 4.6 | Anthropic | `anthropic/claude-sonnet-4-6` |
| Gemini 3 Pro | Google Vertex | `vertex_ai/gemini-3-pro-preview` |
```bash
-export STRIX_LLM="anthropic/claude-sonnet-4-6"
+export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="your-api-key"
```
@@ -52,7 +52,7 @@ See the [Local Models guide](/llm-providers/local) for setup instructions and re
GPT-5 and Codex models.
- Claude Sonnet 4.6, Opus, and Haiku.
+ Claude Opus, Sonnet, and Haiku.
Access 100+ models through a single API.
@@ -76,8 +76,8 @@ See the [Local Models guide](/llm-providers/local) for setup instructions and re
Use LiteLLM's `provider/model-name` format:
```
-anthropic/claude-sonnet-4-6
openai/gpt-5
+anthropic/claude-sonnet-4-6
vertex_ai/gemini-3-pro-preview
bedrock/anthropic.claude-4-5-sonnet-20251022-v1:0
ollama/llama4
diff --git a/docs/quickstart.mdx b/docs/quickstart.mdx
index 32eac3d..bd7a8d9 100644
--- a/docs/quickstart.mdx
+++ b/docs/quickstart.mdx
@@ -30,20 +30,20 @@ Set your LLM provider:
```bash
- export STRIX_LLM="strix/claude-sonnet-4.6"
+ export STRIX_LLM="strix/gpt-5"
export LLM_API_KEY="your-strix-api-key"
```
```bash
- export STRIX_LLM="anthropic/claude-sonnet-4-6"
+ export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="your-api-key"
```
-For best results, use `strix/claude-sonnet-4.6`, `strix/claude-opus-4.6`, or `strix/gpt-5.2`.
+For best results, use `strix/gpt-5`, `strix/claude-opus-4.6`, or `strix/gpt-5.2`.
## Run Your First Scan
diff --git a/scripts/install.sh b/scripts/install.sh
index 7fb158b..67a0e19 100755
--- a/scripts/install.sh
+++ b/scripts/install.sh
@@ -340,7 +340,7 @@ echo -e " ${MUTED}https://models.strix.ai${NC}"
echo ""
echo -e " ${CYAN}2.${NC} Set your environment:"
echo -e " ${MUTED}export LLM_API_KEY='your-api-key'${NC}"
-echo -e " ${MUTED}export STRIX_LLM='strix/claude-sonnet-4.6'${NC}"
+echo -e " ${MUTED}export STRIX_LLM='strix/gpt-5'${NC}"
echo ""
echo -e " ${CYAN}3.${NC} Run a penetration test:"
echo -e " ${MUTED}strix --target https://example.com${NC}"
diff --git a/strix/interface/main.py b/strix/interface/main.py
index f049bbf..2ccf5d8 100644
--- a/strix/interface/main.py
+++ b/strix/interface/main.py
@@ -101,7 +101,7 @@ def validate_environment() -> None: # noqa: PLR0912, PLR0915
error_text.append("• ", style="white")
error_text.append("STRIX_LLM", style="bold cyan")
error_text.append(
- " - Model name to use with litellm (e.g., 'anthropic/claude-sonnet-4-6')\n",
+ " - Model name to use with litellm (e.g., 'openai/gpt-5')\n",
style="white",
)
@@ -141,9 +141,9 @@ def validate_environment() -> None: # noqa: PLR0912, PLR0915
error_text.append("\nExample setup:\n", style="white")
if uses_strix_models:
- error_text.append("export STRIX_LLM='strix/claude-sonnet-4.6'\n", style="dim white")
+ error_text.append("export STRIX_LLM='strix/gpt-5'\n", style="dim white")
else:
- error_text.append("export STRIX_LLM='anthropic/claude-sonnet-4-6'\n", style="dim white")
+ error_text.append("export STRIX_LLM='openai/gpt-5'\n", style="dim white")
if missing_optional_vars:
for var in missing_optional_vars:
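
For reviewers who want to smoke-test the new default documented by this patch, a minimal sketch (the API key is a placeholder, and the actual `strix` invocation is left commented out since it requires a live key):

```shell
# Sketch of the new default environment this patch documents.
# "your-api-key" is a placeholder; substitute a real provider key.
export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="your-api-key"
echo "Using model: $STRIX_LLM"
# strix --target https://example.com   # real run, needs a valid key
```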