fix: update Pi and model provider flows

Advait Paliwal
2026-04-12 13:02:16 -07:00
parent b3a82d4a92
commit aa96b5ee14
14 changed files with 273 additions and 83 deletions

@@ -28,7 +28,7 @@ Feynman supports multiple model providers. The setup wizard presents a list of a
google:gemini-2.5-pro
```
-The model you choose here becomes the default for all sessions. You can override it per-session with the `--model` flag or change it later via `feynman model set <provider:model>`.
+The model you choose here becomes the default for all sessions. You can override it per-session with the `--model` flag or change it later via `feynman model set <provider/model>` or `feynman model set <provider:model>`.
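For example, either separator works when changing the default later (the model name here is illustrative):

```text
# one-off override for a single session
feynman --model google:gemini-2.5-pro

# change the stored default; both forms are accepted
feynman model set google/gemini-2.5-pro
feynman model set google:gemini-2.5-pro
```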
## Stage 2: Authentication
@@ -42,6 +42,16 @@ For API key providers, you are prompted to paste your key directly:
Keys are encrypted at rest and never sent anywhere except the provider's API endpoint.
### Amazon Bedrock
For Amazon Bedrock, choose:
```text
Amazon Bedrock (AWS credential chain)
```
Feynman verifies the same AWS credential chain Pi uses at runtime, including `AWS_PROFILE`, `~/.aws` credentials/config, SSO, ECS/IRSA, and EC2 instance roles. Once that check passes, Bedrock models become available in `feynman model list` without needing a traditional API key.
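For instance, pointing Feynman at an existing AWS profile is enough; the profile name below is hypothetical:

```text
export AWS_PROFILE=my-sso-profile
feynman model list
```

If the credential-chain check passes, Bedrock models appear in the list output with no API key stored.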
### Local models: Ollama, LM Studio, vLLM
If you want to use a model running locally, choose the API-key flow and then select: