---
title: "Configuration"
description: "Environment variables for Strix"
---
Configure Strix using environment variables or a config file.
## LLM Configuration
- `STRIX_LLM`: Model name in LiteLLM format (e.g., `openai/gpt-5.4`, `anthropic/claude-sonnet-4-6`).
- `LLM_API_KEY`: API key for your LLM provider. Not required for local models or for cloud-provider auth (Vertex AI, AWS Bedrock).
- Custom API base URL. Also accepts `OPENAI_API_BASE`, `LITELLM_BASE_URL`, or `OLLAMA_API_BASE`.
- `LLM_TIMEOUT`: Request timeout in seconds for LLM calls.
- Maximum number of retries for LLM API calls on transient failures.
- `STRIX_REASONING_EFFORT`: Controls thinking effort for reasoning models. Valid values: `none`, `minimal`, `low`, `medium`, `high`, `xhigh`. Defaults to `medium` in quick scan mode.
- Timeout in seconds for memory compression operations (context summarization).
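The model and API-base variables above are enough to point Strix at a local model. A minimal sketch, assuming a default Ollama install on its standard port (the model name is illustrative; the `ollama/` prefix follows LiteLLM's provider naming):

```bash
# Run Strix against a local Ollama model (no LLM_API_KEY needed for local models).
# Model name and port are assumptions; adjust to your install.
export STRIX_LLM="ollama/llama3"
export OLLAMA_API_BASE="http://localhost:11434"
```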
## Optional Features
- `PERPLEXITY_API_KEY`: API key for Perplexity AI. Enables real-time web search during scans for OSINT and vulnerability research.
- Disables browser automation tools.
- `STRIX_TELEMETRY`: Global telemetry default toggle. Set to `0`, `false`, `no`, or `off` to disable both PostHog and OTEL unless overridden by the per-channel flags below.
- Enables/disables OpenTelemetry run observability independently. When unset, falls back to `STRIX_TELEMETRY`.
- Enables/disables PostHog product telemetry independently. When unset, falls back to `STRIX_TELEMETRY`.
- `TRACELOOP_BASE_URL`: OTLP/Traceloop base URL for remote OpenTelemetry export. If unset, Strix keeps traces local only.
- `TRACELOOP_API_KEY`: API key used for remote trace export. Remote export is enabled only when both `TRACELOOP_BASE_URL` and `TRACELOOP_API_KEY` are set.
- Optional custom OTEL headers (a JSON object or `key=value,key2=value2` pairs). Useful for Langfuse or custom/self-hosted OTLP gateways.
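For example, remote trace export only needs the two Traceloop variables documented above. A sketch with placeholder values (the endpoint URL and key are not real):

```bash
# Enable remote OpenTelemetry export; both variables must be set for it to activate.
# Values below are placeholders, substitute your collector's endpoint and key.
export TRACELOOP_BASE_URL="https://otlp.example.com"
export TRACELOOP_API_KEY="tl-placeholder-key"

# Or, to disable all telemetry instead:
# export STRIX_TELEMETRY="off"
```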
When remote OTEL vars are not set, Strix still writes complete run telemetry locally to:
```bash
strix_runs//events.jsonl
```
When remote vars are set, Strix dual-writes telemetry to both local JSONL and the remote OTEL endpoint.
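Because the local log is plain JSONL, it can be inspected with standard shell tools. A minimal sketch, assuming each run writes its own `events.jsonl` under a per-run subdirectory of `strix_runs/` (adjust the glob to your actual layout):

```bash
# Print the last few telemetry events from every local run log.
# The per-run directory layout is an assumption; tweak the glob if yours differs.
for f in strix_runs/*/events.jsonl; do
  [ -e "$f" ] || continue   # glob matched nothing: no runs yet
  echo "== $f =="
  tail -n 3 "$f"
done
```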
## Docker Configuration
- Docker image to use for the sandbox container.
- Docker daemon socket path. Use this for remote Docker hosts or custom configurations.
- Runtime backend for the sandbox environment.
## Sandbox Configuration
- `STRIX_SANDBOX_EXECUTION_TIMEOUT`: Maximum execution time in seconds for sandbox operations.
- Timeout in seconds for connecting to the sandbox container.
## Config File
Strix stores configuration in `~/.strix/cli-config.json`. You can also specify a custom config file:
```bash
strix --target ./app --config /path/to/config.json
```
**Config file format:**
```json
{
"env": {
"STRIX_LLM": "openai/gpt-5.4",
"LLM_API_KEY": "sk-...",
"STRIX_REASONING_EFFORT": "high"
}
}
```
## Example Setup
```bash
# Required
export STRIX_LLM="openai/gpt-5.4"
export LLM_API_KEY="sk-..."
# Optional: Enable web search
export PERPLEXITY_API_KEY="pplx-..."
# Optional: Custom timeouts
export LLM_TIMEOUT="600"
export STRIX_SANDBOX_EXECUTION_TIMEOUT="300"
```