feat: add API key and custom provider configuration (#4)
* feat: add API key and custom provider configuration

  Previously, model setup only offered OAuth login. This adds:

  - API key configuration for 17 built-in providers (OpenAI, Anthropic, Google, Mistral, Groq, xAI, OpenRouter, etc.)
  - Custom provider setup via models.json (for Ollama, vLLM, LM Studio, proxies, or any OpenAI/Anthropic/Google-compatible endpoint)
  - Interactive prompts with smart defaults and auto-detection of models
  - Verification flow that probes endpoints and provides actionable tips
  - Doctor diagnostics for the models.json path and missing apiKey warnings
  - Dev environment fallback for running without dist/ build artifacts
  - Unified auth flow: `feynman model login` now offers both API key and OAuth options (OAuth-only when a specific provider is given)

  New files:

  - src/model/models-json.ts: read/write models.json with proper merging
  - src/model/registry.ts: centralized ModelRegistry creation with modelsJsonPath
  - tests/models-json.test.ts: unit tests for provider config upsert

* fix: harden runtime env and custom provider auth

---------

Co-authored-by: Advait Paliwal <advaitspaliwal@gmail.com>
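The "proper merging" that src/model/models-json.ts is described as doing can be sketched as follows. This is a minimal illustration, not the project's actual code: the `ProviderConfig` fields, the `upsertProvider` name, and the shape of models.json are all assumptions made for the example.

```typescript
// Hypothetical shape of a models.json provider entry; the real schema
// in src/model/models-json.ts may differ.
interface ProviderConfig {
  baseUrl: string;   // e.g. a local Ollama or vLLM endpoint
  apiKey?: string;   // optional for local servers
  models?: string[]; // auto-detected when omitted
}

type ModelsJson = { providers: Record<string, ProviderConfig> };

// Merge a provider entry into the config without clobbering sibling
// providers or fields the caller did not supply.
function upsertProvider(
  config: ModelsJson,
  name: string,
  patch: Partial<ProviderConfig> & { baseUrl: string },
): ModelsJson {
  const existing = config.providers[name] ?? ({} as Partial<ProviderConfig>);
  return {
    ...config,
    providers: {
      ...config.providers,
      [name]: { ...existing, ...patch },
    },
  };
}

// Example: register a local Ollama endpoint, then patch only its apiKey;
// the second upsert keeps the models list from the first.
let cfg: ModelsJson = { providers: {} };
cfg = upsertProvider(cfg, "ollama", {
  baseUrl: "http://localhost:11434/v1",
  models: ["llama3"],
});
cfg = upsertProvider(cfg, "ollama", {
  baseUrl: "http://localhost:11434/v1",
  apiKey: "unused",
});
```

Returning a fresh object rather than mutating in place keeps the read-modify-write cycle on models.json easy to test, which matches the unit-test file listed above.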
commit 30d07246d1 (parent dbd89d8e3d), committed by GitHub
.env.example (+14)
@@ -6,6 +6,20 @@ FEYNMAN_THINKING=medium
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
GEMINI_API_KEY=
OPENROUTER_API_KEY=
ZAI_API_KEY=
KIMI_API_KEY=
MINIMAX_API_KEY=
MINIMAX_CN_API_KEY=
MISTRAL_API_KEY=
GROQ_API_KEY=
XAI_API_KEY=
CEREBRAS_API_KEY=
HF_TOKEN=
OPENCODE_API_KEY=
AI_GATEWAY_API_KEY=
AZURE_OPENAI_API_KEY=

RUNPOD_API_KEY=
MODAL_TOKEN_ID=
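The .env.example keys above are the credentials the API-key flow can pick up from the environment. A minimal sketch of how a CLI might resolve them, assuming a hypothetical provider-to-env-var mapping (the project's actual lookup may differ):

```typescript
// Hypothetical mapping from provider id to its env var; illustrative
// subset only, not the project's real table.
const ENV_KEYS: Record<string, string> = {
  openai: "OPENAI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  google: "GEMINI_API_KEY",
  openrouter: "OPENROUTER_API_KEY",
  mistral: "MISTRAL_API_KEY",
  groq: "GROQ_API_KEY",
  xai: "XAI_API_KEY",
};

// Return the API key for a provider, or undefined so callers can fall
// back to OAuth or surface a doctor-style "missing apiKey" warning.
// Empty strings (the .env.example defaults) count as unset.
function resolveApiKey(
  provider: string,
  env: Record<string, string | undefined>,
): string | undefined {
  const name = ENV_KEYS[provider];
  const value = name ? env[name] : undefined;
  return value && value.length > 0 ? value : undefined;
}
```

Treating the empty-string defaults as "unset" matters here: copying .env.example verbatim should leave every provider in the OAuth/warning path rather than sending blank Authorization headers.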