feynman/.env.example
Mochamad Chairulridjal 30d07246d1 feat: add API key and custom provider configuration (#4)
* feat: add API key and custom provider configuration

Previously, model setup only offered OAuth login. This adds:

- API key configuration for 17 built-in providers (OpenAI, Anthropic,
  Google, Mistral, Groq, xAI, OpenRouter, etc.)
- Custom provider setup via models.json (for Ollama, vLLM, LM Studio,
  proxies, or any OpenAI/Anthropic/Google-compatible endpoint)
- Interactive prompts with smart defaults and auto-detection of models
- Verification flow that probes endpoints and provides actionable tips
- Doctor diagnostics that report the models.json path and warn about
  missing apiKey entries
- Dev environment fallback for running without dist/ build artifacts
- Unified auth flow: `feynman model login` now offers both API key
  and OAuth options (OAuth-only when a specific provider is given)
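
For illustration, a custom provider entry in models.json for a local
Ollama server might look like the sketch below. The exact schema is not
shown in this commit, so the key names (`providers`, `baseUrl`, `api`,
`models`) are assumptions; only the Ollama OpenAI-compatible endpoint
URL is a known default.

```json
{
  "providers": {
    "ollama": {
      "baseUrl": "http://localhost:11434/v1",
      "api": "openai-compat",
      "models": [
        { "id": "llama3.1:8b", "name": "Llama 3.1 8B" }
      ]
    }
  }
}
```

The same shape would cover vLLM, LM Studio, or any proxy, since all of
them expose an OpenAI-compatible HTTP endpoint behind a base URL.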

New files:
- src/model/models-json.ts: Read/write models.json with proper merging
- src/model/registry.ts: Centralized ModelRegistry creation with modelsJsonPath
- tests/models-json.test.ts: Unit tests for provider config upsert
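
The "proper merging" in models-json.ts could be sketched as below. This
is a hedged reconstruction, not the actual source: the type names and
the `upsertProvider` function are hypothetical, chosen to show a merge
that updates one provider without clobbering hand-edited fields or
unrelated providers.

```typescript
// Hypothetical shapes; the real models.json schema may differ.
interface ProviderConfig {
  baseUrl?: string;
  apiKey?: string;
  models?: Record<string, unknown>;
}

interface ModelsJson {
  providers?: Record<string, ProviderConfig>;
}

// Merge a provider patch into an existing models.json object.
// Existing fields on the same provider survive unless the patch
// overrides them; other providers are left untouched.
function upsertProvider(
  existing: ModelsJson,
  name: string,
  patch: ProviderConfig,
): ModelsJson {
  const current = existing.providers?.[name] ?? {};
  return {
    ...existing,
    providers: {
      ...existing.providers,
      [name]: { ...current, ...patch },
    },
  };
}
```

A shallow spread-merge like this is a deliberate choice: it keeps the
write idempotent, so re-running the interactive setup only touches the
fields it prompts for.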

* fix: harden runtime env and custom provider auth

---------

Co-authored-by: Advait Paliwal <advaitspaliwal@gmail.com>
2026-03-26 17:09:38 -07:00

27 lines · 472 B · Plaintext

# Optional runtime defaults for Feynman.
# Provider credentials are read by pi-coding-agent in the usual ways.
FEYNMAN_MODEL=
FEYNMAN_THINKING=medium
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
GEMINI_API_KEY=
OPENROUTER_API_KEY=
ZAI_API_KEY=
KIMI_API_KEY=
MINIMAX_API_KEY=
MINIMAX_CN_API_KEY=
MISTRAL_API_KEY=
GROQ_API_KEY=
XAI_API_KEY=
CEREBRAS_API_KEY=
HF_TOKEN=
OPENCODE_API_KEY=
AI_GATEWAY_API_KEY=
AZURE_OPENAI_API_KEY=
RUNPOD_API_KEY=
MODAL_TOKEN_ID=
MODAL_TOKEN_SECRET=
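
A minimal sketch of how such defaults might be consumed on the Node
side. The `resolveDefaults` helper is hypothetical (Feynman's actual
loading code is not shown here); it only illustrates that the empty
values in this .env.example should count as unset rather than as
empty-string configuration.

```typescript
// Resolve Feynman's optional runtime defaults from the environment.
// Empty strings (as committed in .env.example) are treated as unset.
function resolveDefaults(env: NodeJS.ProcessEnv = process.env) {
  return {
    model: env.FEYNMAN_MODEL || undefined,
    thinking: env.FEYNMAN_THINKING || "medium",
  };
}
```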