---
title: "Configuration"
description: "Environment variables for Strix"
---

Configure Strix using environment variables or a config file.

## LLM Configuration

- `STRIX_LLM` — Model name in LiteLLM format (e.g., `anthropic/claude-sonnet-4-6`, `openai/gpt-5`).
- `LLM_API_KEY` — API key for your LLM provider. Not required for local models or for cloud provider auth (Vertex AI, AWS Bedrock).
- Custom API base URL. Also accepts `OPENAI_API_BASE`, `LITELLM_BASE_URL`, or `OLLAMA_API_BASE`.
- `LLM_TIMEOUT` — Request timeout in seconds for LLM calls.
- Maximum number of retries for LLM API calls on transient failures.
- `STRIX_REASONING_EFFORT` — Controls thinking effort for reasoning models. Valid values: `none`, `minimal`, `low`, `medium`, `high`, `xhigh`. Defaults to `medium` for quick scan mode.
- Timeout in seconds for memory compression operations (context summarization).

## Optional Features

- `PERPLEXITY_API_KEY` — API key for Perplexity AI. Enables real-time web search during scans for OSINT and vulnerability research.
- Disable browser automation tools.
- Enable or disable anonymous telemetry. Set to `0`, `false`, `no`, or `off` to disable.

## Docker Configuration

- Docker image to use for the sandbox container.
- Docker daemon socket path. Use this for remote Docker hosts or custom configurations.
- Runtime backend for the sandbox environment.

## Sandbox Configuration

- `STRIX_SANDBOX_EXECUTION_TIMEOUT` — Maximum execution time in seconds for sandbox operations.
- Timeout in seconds for connecting to the sandbox container.

## Config File

Strix stores configuration in `~/.strix/cli-config.json`. You can also specify a custom config file:

```bash
strix --target ./app --config /path/to/config.json
```

**Config file format:**

```json
{
  "env": {
    "STRIX_LLM": "anthropic/claude-sonnet-4-6",
    "LLM_API_KEY": "sk-...",
    "STRIX_REASONING_EFFORT": "high"
  }
}
```

## Example Setup

```bash
# Required
export STRIX_LLM="anthropic/claude-sonnet-4-6"
export LLM_API_KEY="sk-..."

# Optional: Enable web search
export PERPLEXITY_API_KEY="pplx-..."

# Optional: Custom timeouts
export LLM_TIMEOUT="600"
export STRIX_SANDBOX_EXECUTION_TIMEOUT="300"
```
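For the local-model case mentioned above (where no API key is required), a minimal sketch of a setup using Ollama might look like the following. The model name `ollama/llama3` and the port are assumptions for illustration, not documented defaults:

```shell
# Sketch: pointing Strix at a local Ollama model (no LLM_API_KEY needed).
# Model name and base URL below are example values -- substitute your own.
export STRIX_LLM="ollama/llama3"
export OLLAMA_API_BASE="http://localhost:11434"
```

With a local backend, the remaining variables (timeouts, reasoning effort, sandbox settings) apply unchanged.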