---
title: "Overview"
description: "Configure your AI model for Strix"
---

Strix uses [LiteLLM](https://docs.litellm.ai/docs/providers) for model compatibility, supporting 100+ LLM providers.

## Recommended Models

For best results, use one of these models:

| Model             | Provider      | Configuration                    |
| ----------------- | ------------- | -------------------------------- |
| GPT-5             | OpenAI        | `openai/gpt-5`                   |
| Claude 4.5 Sonnet | Anthropic     | `anthropic/claude-sonnet-4-5`    |
| Gemini 3 Pro      | Google Vertex | `vertex_ai/gemini-3-pro-preview` |

## Quick Setup

```bash
export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="your-api-key"
```

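Switching providers is just a matter of changing the same two variables — a minimal sketch for Claude 4.5 Sonnet, where the key value is a placeholder for your real Anthropic API key:

```shell
# Select Claude 4.5 Sonnet via Anthropic (model string from the table above)
export STRIX_LLM="anthropic/claude-sonnet-4-5"
# Placeholder -- substitute your real Anthropic API key
export LLM_API_KEY="your-anthropic-api-key"
```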
## Provider Guides

<CardGroup cols={2}>
<Card title="OpenAI" href="/llm-providers/openai">
GPT-5 and Codex models.
</Card>
<Card title="Anthropic" href="/llm-providers/anthropic">
Claude 4.5 Sonnet, Opus, and Haiku.
</Card>
<Card title="OpenRouter" href="/llm-providers/openrouter">
Access 100+ models through a single API.
</Card>
<Card title="Google Vertex AI" href="/llm-providers/vertex">
Gemini 3 models via Google Cloud.
</Card>
<Card title="AWS Bedrock" href="/llm-providers/bedrock">
Claude 4.5 and Titan models via AWS.
</Card>
<Card title="Azure OpenAI" href="/llm-providers/azure">
GPT-5 via Azure.
</Card>
<Card title="Local Models" href="/llm-providers/local">
Llama 4, Mistral, and self-hosted models.
</Card>
</CardGroup>

## Model Format

Use LiteLLM's `provider/model-name` format:

```
openai/gpt-5
anthropic/claude-sonnet-4-5
vertex_ai/gemini-3-pro-preview
bedrock/anthropic.claude-4-5-sonnet-20251022-v1:0
ollama/llama4
```
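The same two-variable setup covers self-hosted models as well — a minimal sketch for a local Llama 4 served through Ollama, assuming an Ollama instance is already running (no real API key is involved, so the key value below is just a placeholder):

```shell
# Point Strix at a locally served Llama 4 ("ollama" is LiteLLM's provider prefix)
export STRIX_LLM="ollama/llama4"
# Local backends typically ignore the key; a dummy value keeps the variable set
export LLM_API_KEY="local"
```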