diff --git a/README.md b/README.md
index 2b3ff64..9d3b7ef 100644
--- a/README.md
+++ b/README.md
@@ -207,7 +207,7 @@
 export LLM_API_BASE="your-api-base-url" # if using a local model, e.g. Ollama,
 export PERPLEXITY_API_KEY="your-api-key" # for search capabilities
 ```

-[OpenAI's GPT-5](https://openai.com/api/) (`openai/gpt-5`) and [Anthropic's Claude Sonnet 4.5](https://claude.com/platform/api) (`anthropic/claude-sonnet-4-5`) work best with Strix, but we support many [other options](https://docs.litellm.ai/docs/providers).
+[OpenAI's GPT-5](https://openai.com/api/) (`openai/gpt-5`) and [Anthropic's Claude Sonnet 4.5](https://claude.com/platform/api) (`anthropic/claude-sonnet-4-5`) are the recommended models for best results with Strix. We also support many [other options](https://docs.litellm.ai/docs/providers), including cloud and local models, though their performance and reliability may vary.

 ## 🤝 Contributing