fix: Change default model from claude-sonnet-4-6 to gpt-5 across docs and code
@@ -30,7 +30,7 @@ Thank you for your interest in contributing to Strix! This guide will help you g
 3. **Configure your LLM provider**
 ```bash
-export STRIX_LLM="anthropic/claude-sonnet-4-6"
+export STRIX_LLM="openai/gpt-5"
 export LLM_API_KEY="your-api-key"
 ```
@@ -86,7 +86,7 @@ curl -sSL https://strix.ai/install | bash
 pipx install strix-agent

 # Configure your AI provider
-export STRIX_LLM="anthropic/claude-sonnet-4-6" # or "strix/claude-sonnet-4.6" via Strix Router (https://models.strix.ai)
+export STRIX_LLM="openai/gpt-5" # or "strix/gpt-5" via Strix Router (https://models.strix.ai)
 export LLM_API_KEY="your-api-key"

 # Run your first security assessment
@@ -203,7 +203,7 @@ jobs:
 ### Configuration

 ```bash
-export STRIX_LLM="anthropic/claude-sonnet-4-6"
+export STRIX_LLM="openai/gpt-5"
 export LLM_API_KEY="your-api-key"

 # Optional
@@ -217,8 +217,8 @@ export STRIX_REASONING_EFFORT="high" # control thinking effort (default: high,
 **Recommended models for best results:**

-- [Anthropic Claude Sonnet 4.6](https://claude.com/platform/api) — `anthropic/claude-sonnet-4-6`
 - [OpenAI GPT-5](https://openai.com/api/) — `openai/gpt-5`
+- [Anthropic Claude Sonnet 4.6](https://claude.com/platform/api) — `anthropic/claude-sonnet-4-6`
 - [Google Gemini 3 Pro Preview](https://cloud.google.com/vertex-ai) — `vertex_ai/gemini-3-pro-preview`

 See the [LLM Providers documentation](https://docs.strix.ai/llm-providers/overview) for all supported providers including Vertex AI, Bedrock, Azure, and local models.
@@ -8,7 +8,7 @@ Configure Strix using environment variables or a config file.
 ## LLM Configuration

 <ParamField path="STRIX_LLM" type="string" required>
-Model name in LiteLLM format (e.g., `anthropic/claude-sonnet-4-6`, `openai/gpt-5`).
+Model name in LiteLLM format (e.g., `openai/gpt-5`, `anthropic/claude-sonnet-4-6`).
 </ParamField>

 <ParamField path="LLM_API_KEY" type="string">
@@ -86,7 +86,7 @@ strix --target ./app --config /path/to/config.json
 ```json
 {
   "env": {
-    "STRIX_LLM": "anthropic/claude-sonnet-4-6",
+    "STRIX_LLM": "openai/gpt-5",
     "LLM_API_KEY": "sk-...",
     "STRIX_REASONING_EFFORT": "high"
   }
@@ -97,7 +97,7 @@ strix --target ./app --config /path/to/config.json
 ```bash
 # Required
-export STRIX_LLM="anthropic/claude-sonnet-4-6"
+export STRIX_LLM="openai/gpt-5"
 export LLM_API_KEY="sk-..."

 # Optional: Enable web search
@@ -32,7 +32,7 @@ description: "Contribute to Strix development"
 </Step>
 <Step title="Configure LLM">
 ```bash
-export STRIX_LLM="anthropic/claude-sonnet-4-6"
+export STRIX_LLM="openai/gpt-5"
 export LLM_API_KEY="your-api-key"
 ```
 </Step>
@@ -78,7 +78,7 @@ Strix uses a graph of specialized agents for comprehensive security testing:
 curl -sSL https://strix.ai/install | bash

 # Configure
-export STRIX_LLM="anthropic/claude-sonnet-4-6"
+export STRIX_LLM="openai/gpt-5"
 export LLM_API_KEY="your-api-key"

 # Scan
@@ -35,7 +35,7 @@ Add these secrets to your repository:
 | Secret | Description |
 |--------|-------------|
-| `STRIX_LLM` | Model name (e.g., `anthropic/claude-sonnet-4-6`) |
+| `STRIX_LLM` | Model name (e.g., `openai/gpt-5`) |
 | `LLM_API_KEY` | API key for your LLM provider |

 ## Exit Codes
@@ -6,7 +6,7 @@ description: "Configure Strix with Claude models"
 ## Setup

 ```bash
-export STRIX_LLM="anthropic/claude-sonnet-4-6"
+export STRIX_LLM="openai/gpt-5"
 export LLM_API_KEY="sk-ant-..."
 ```
@@ -14,7 +14,7 @@ export LLM_API_KEY="sk-ant-..."
 | Model | Description |
 |-------|-------------|
-| `anthropic/claude-sonnet-4-6` | Best balance of intelligence and speed (recommended) |
+| `anthropic/claude-sonnet-4-6` | Best balance of intelligence and speed |
 | `anthropic/claude-opus-4-6` | Maximum capability for deep analysis |

 ## Get API Key
@@ -25,7 +25,7 @@ Strix Router is currently in **beta**. It's completely optional — Strix works
 ```bash
 export LLM_API_KEY='your-strix-api-key'
-export STRIX_LLM='strix/claude-sonnet-4.6'
+export STRIX_LLM='strix/gpt-5'
 ```

 3. Run a scan:
@@ -10,7 +10,7 @@ Strix uses [LiteLLM](https://docs.litellm.ai/docs/providers) for model compatibi
 The fastest way to get started. [Strix Router](/llm-providers/models) gives you access to tested models with the highest rate limits and zero data retention.

 ```bash
-export STRIX_LLM="strix/claude-sonnet-4.6"
+export STRIX_LLM="strix/gpt-5"
 export LLM_API_KEY="your-strix-api-key"
 ```
@@ -22,12 +22,12 @@ You can also use any LiteLLM-compatible provider with your own API keys:
 | Model             | Provider      | Configuration                    |
 | ----------------- | ------------- | -------------------------------- |
-| Claude Sonnet 4.6 | Anthropic     | `anthropic/claude-sonnet-4-6`    |
 | GPT-5             | OpenAI        | `openai/gpt-5`                   |
+| Claude Sonnet 4.6 | Anthropic     | `anthropic/claude-sonnet-4-6`    |
 | Gemini 3 Pro      | Google Vertex | `vertex_ai/gemini-3-pro-preview` |

 ```bash
-export STRIX_LLM="anthropic/claude-sonnet-4-6"
+export STRIX_LLM="openai/gpt-5"
 export LLM_API_KEY="your-api-key"
 ```
@@ -52,7 +52,7 @@ See the [Local Models guide](/llm-providers/local) for setup instructions and re
 GPT-5 and Codex models.
 </Card>
 <Card title="Anthropic" href="/llm-providers/anthropic">
-Claude Sonnet 4.6, Opus, and Haiku.
+Claude Opus, Sonnet, and Haiku.
 </Card>
 <Card title="OpenRouter" href="/llm-providers/openrouter">
 Access 100+ models through a single API.
@@ -76,8 +76,8 @@ See the [Local Models guide](/llm-providers/local) for setup instructions and re
 Use LiteLLM's `provider/model-name` format:

 ```
-anthropic/claude-sonnet-4-6
 openai/gpt-5
+anthropic/claude-sonnet-4-6
 vertex_ai/gemini-3-pro-preview
 bedrock/anthropic.claude-4-5-sonnet-20251022-v1:0
 ollama/llama4
@@ -30,20 +30,20 @@ Set your LLM provider:
 <Tabs>
 <Tab title="Strix Router">
 ```bash
-export STRIX_LLM="strix/claude-sonnet-4.6"
+export STRIX_LLM="strix/gpt-5"
 export LLM_API_KEY="your-strix-api-key"
 ```
 </Tab>
 <Tab title="Bring Your Own Key">
 ```bash
-export STRIX_LLM="anthropic/claude-sonnet-4-6"
+export STRIX_LLM="openai/gpt-5"
 export LLM_API_KEY="your-api-key"
 ```
 </Tab>
 </Tabs>

 <Tip>
-For best results, use `strix/claude-sonnet-4.6`, `strix/claude-opus-4.6`, or `strix/gpt-5.2`.
+For best results, use `strix/gpt-5`, `strix/claude-opus-4.6`, or `strix/gpt-5.2`.
 </Tip>

 ## Run Your First Scan
@@ -340,7 +340,7 @@ echo -e " ${MUTED}https://models.strix.ai${NC}"
 echo ""
 echo -e " ${CYAN}2.${NC} Set your environment:"
 echo -e " ${MUTED}export LLM_API_KEY='your-api-key'${NC}"
-echo -e " ${MUTED}export STRIX_LLM='strix/claude-sonnet-4.6'${NC}"
+echo -e " ${MUTED}export STRIX_LLM='strix/gpt-5'${NC}"
 echo ""
 echo -e " ${CYAN}3.${NC} Run a penetration test:"
 echo -e " ${MUTED}strix --target https://example.com${NC}"
@@ -101,7 +101,7 @@ def validate_environment() -> None:  # noqa: PLR0912, PLR0915
     error_text.append("• ", style="white")
     error_text.append("STRIX_LLM", style="bold cyan")
     error_text.append(
-        " - Model name to use with litellm (e.g., 'anthropic/claude-sonnet-4-6')\n",
+        " - Model name to use with litellm (e.g., 'openai/gpt-5')\n",
         style="white",
     )
@@ -141,9 +141,9 @@ def validate_environment() -> None:  # noqa: PLR0912, PLR0915

     error_text.append("\nExample setup:\n", style="white")
     if uses_strix_models:
-        error_text.append("export STRIX_LLM='strix/claude-sonnet-4.6'\n", style="dim white")
+        error_text.append("export STRIX_LLM='strix/gpt-5'\n", style="dim white")
     else:
-        error_text.append("export STRIX_LLM='anthropic/claude-sonnet-4-6'\n", style="dim white")
+        error_text.append("export STRIX_LLM='openai/gpt-5'\n", style="dim white")

     if missing_optional_vars:
         for var in missing_optional_vars: