fix: update Pi and model provider flows

Advait Paliwal
2026-04-12 13:02:16 -07:00
parent b3a82d4a92
commit aa96b5ee14
14 changed files with 273 additions and 83 deletions


@@ -167,3 +167,22 @@ Use this file to track chronology, not release notes. Keep entries short, factua
 - Failed / learned: Website typecheck was previously a no-op prompt because `@astrojs/check` was missing; installing it exposed dev-audit findings that needed explicit overrides before the full website audit was clean.
 - Blockers: Docker Desktop remained unreliable after restart attempts, so this pass still does not include a second successful public-installer Linux Docker run.
 - Next: Push the RPC/website verification commit and keep future Docker/public-installer validation separate from repo correctness unless Docker is stable.
+
+### 2026-04-12 09:32 PDT — pi-0.66.1-upgrade-pass
+- Objective: Update Feynman from Pi `0.64.0` to the current `0.66.1` packages and absorb any downstream SDK/runtime compatibility changes instead of leaving the repo pinned behind upstream.
+- Changed: Bumped `@mariozechner/pi-ai` and `@mariozechner/pi-coding-agent` to `0.66.1` plus `@companion-ai/alpha-hub` to `0.1.3` in `package.json` and `package-lock.json`; updated `extensions/research-tools.ts` to stop listening for the removed `session_switch` extension event and rely on `session_start`, which now carries startup/reload/new/resume/fork reasons in Pi `0.66.x`.
+- Verified: Ran `npm test`, `npm run typecheck`, and `npm run build` successfully after the upgrade; smoke-ran `node bin/feynman.js --version`, `node bin/feynman.js doctor`, and `node bin/feynman.js status` successfully; checked upstream package diffs and confirmed the breaking change that affected this repo was the typed extension lifecycle change in `pi-coding-agent`, while `pi-ai` mainly brought refreshed provider/model catalog code including Bedrock/OpenAI provider updates and new generated model entries.
+- Failed / learned: `ctx7` resolved Pi correctly to `/badlogic/pi-mono`, but its docs snapshot was not release-note oriented; the concrete downstream-impact analysis came from the actual `0.64.0` → `0.66.1` package diffs and local validation, not from prose docs alone.
+- Failed / learned: The first post-upgrade CLI smoke test failed before Feynman startup because `@companion-ai/alpha-hub@0.1.2` shipped a zero-byte `src/lib/auth.js`; bumping to `0.1.3` fixed that adjacent runtime blocker.
+- Blockers: `npm install` reports two high-severity vulnerabilities remain in the dependency tree; this pass focused on the Pi upgrade and did not remediate unrelated audit findings.
+- Next: Push the Pi upgrade, then decide whether to layer the pending model-command fixes on top of this branch or land them separately to keep the dependency bump easy to review.
+
+### 2026-04-12 13:00 PDT — model-command-and-bedrock-fix-pass
+- Objective: Finish the remaining user-facing model-management regressions instead of stopping at the Pi dependency bump.
+- Changed: Updated `src/model/commands.ts` so `feynman model login <provider>` resolves both OAuth and API-key providers; `feynman model logout <provider>` clears either auth mode; `feynman model set` accepts both `provider/model` and `provider:model`; ambiguous bare model IDs now prefer explicitly configured providers from auth storage; added an `amazon-bedrock` setup path that validates the AWS credential chain with the AWS SDK and stores Pi's `<authenticated>` sentinel so Bedrock models appear in `model list`; synced `src/cli.ts`, `metadata/commands.mjs`, `README.md`, and the website docs to the new behavior.
+- Verified: Added regression tests in `tests/model-harness.test.ts` for `provider:model`, API-key provider resolution, and ambiguous bare-ID handling; ran `npm test`, `npm run typecheck`, `npm run build`, and `cd website && npm run build`; exercised command-level flows against throwaway `FEYNMAN_HOME` directories: interactive `node bin/feynman.js model login google`, `node bin/feynman.js model set google:gemini-3-pro-preview`, `node bin/feynman.js model set gpt-5.4` with only OpenAI configured, and `node bin/feynman.js model login amazon-bedrock`; confirmed `model list` shows Bedrock models after the new setup path; ran a live one-shot prompt `node bin/feynman.js --prompt "Reply with exactly OK"` and got `OK`.
+- Failed / learned: The website build still emits duplicate-id warnings for a handful of docs pages, but it completes successfully; those warnings predate this pass and were not introduced by the model-command edits.
+- Blockers: The Bedrock path is verified with the current shell's AWS credential chain, not with a fresh machine lacking AWS config; broader upstream Pi behavior around IMDS/default-profile autodiscovery without the sentinel is still outside this repo.
+- Next: Commit and push the combined Pi/model/docs maintenance branch, then decide whether to tackle the deeper search/deepresearch hang issues separately or leave them for focused repro work.
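
The `provider:model` acceptance described in this pass reduces to a one-line rewrite before catalog lookup. A minimal standalone sketch (the helper name `normalizeModelSpec` is hypothetical; the repo's real resolution logic lives in `src/model/commands.ts`):

```typescript
// Sketch only: both `provider:model` and `provider/model` spellings should
// resolve to the same catalog key.
function normalizeModelSpec(input: string): string {
  // Rewrite the first `provider:` separator to `provider/`, then lowercase
  // so comparisons against the model catalog are case-insensitive.
  return input.trim().replace(/^([^/:]+):(.+)$/, "$1/$2").toLowerCase();
}

console.log(normalizeModelSpec("google:gemini-3-pro-preview")); // google/gemini-3-pro-preview
console.log(normalizeModelSpec("openai/gpt-5.4")); // unchanged: openai/gpt-5.4
```

Because the character class excludes `/`, an input already in `provider/model` form never matches the regex and passes through untouched.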


@@ -29,6 +29,8 @@ The one-line installer fetches the latest tagged release. To pin a version, pass
 The installer downloads a standalone native bundle with its own Node.js runtime.
+
+To upgrade the standalone app later, rerun the installer. `feynman update` only refreshes installed Pi packages inside Feynman's environment; it does not replace the standalone runtime bundle itself.

 Local models are supported through the custom-provider flow. For Ollama, run `feynman setup`, choose `Custom provider (baseUrl + API key)`, use `openai-completions`, and point it at `http://localhost:11434/v1`.

 ### Skills Only


@@ -11,14 +11,11 @@ import { registerServiceTierControls } from "./research-tools/service-tier.js";
 export default function researchTools(pi: ExtensionAPI): void {
   const cache: { agentSummaryPromise?: Promise<{ agents: string[]; chains: string[] }> } = {};

+  // Pi 0.66.x folds post-switch/resume lifecycle into session_start.
   pi.on("session_start", async (_event, ctx) => {
     await installFeynmanHeader(pi, ctx, cache);
   });

-  pi.on("session_switch", async (_event, ctx) => {
-    await installFeynmanHeader(pi, ctx, cache);
-  });
-
   registerAlphaTools(pi);
   registerDiscoveryCommands(pi);
   registerFeynmanModelCommand(pi);
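
The consolidation above can be sketched with a toy emitter: one `session_start` listener now covers every lifecycle path that previously needed a separate `session_switch` handler. The event shape and `SessionStartReason` union here are illustrative stand-ins, not Pi's real types:

```typescript
// Hypothetical minimal shapes; Pi 0.66.x reports the lifecycle reason on the
// session_start event itself (startup/reload/new/resume/fork).
type SessionStartReason = "startup" | "reload" | "new" | "resume" | "fork";

interface SessionStartEvent {
  reason: SessionStartReason;
}

type Handler = (event: SessionStartEvent) => void;

class MiniEmitter {
  private handlers = new Map<string, Handler[]>();
  on(name: string, handler: Handler): void {
    const list = this.handlers.get(name) ?? [];
    list.push(handler);
    this.handlers.set(name, list);
  }
  emit(name: string, event: SessionStartEvent): void {
    for (const handler of this.handlers.get(name) ?? []) handler(event);
  }
}

// A single listener observes startup, resume, and fork alike.
const seen: SessionStartReason[] = [];
const pi = new MiniEmitter();
pi.on("session_start", (event) => seen.push(event.reason));

for (const reason of ["startup", "resume", "fork"] as const) {
  pi.emit("session_start", { reason });
}
console.log(seen); // logs the three reasons in order
```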


@@ -86,9 +86,9 @@ export const cliCommandSections = [
     title: "Model Management",
     commands: [
       { usage: "feynman model list", description: "List available models in Pi auth storage." },
-      { usage: "feynman model login [id]", description: "Login to a Pi OAuth model provider." },
-      { usage: "feynman model logout [id]", description: "Logout from a Pi OAuth model provider." },
-      { usage: "feynman model set <provider/model>", description: "Set the default model." },
+      { usage: "feynman model login [id]", description: "Authenticate a model provider with OAuth or API-key setup." },
+      { usage: "feynman model logout [id]", description: "Clear stored auth for a model provider." },
+      { usage: "feynman model set <provider/model>", description: "Set the default model (also accepts provider:model)." },
       { usage: "feynman model tier [value]", description: "View or set the request service tier override." },
     ],
   },
@@ -118,7 +118,7 @@ export const legacyFlags = [
   { usage: "--alpha-login", description: "Sign in to alphaXiv and exit." },
   { usage: "--alpha-logout", description: "Clear alphaXiv auth and exit." },
   { usage: "--alpha-status", description: "Show alphaXiv auth status and exit." },
-  { usage: "--model <provider:model>", description: "Force a specific model." },
+  { usage: "--model <provider/model|provider:model>", description: "Force a specific model." },
   { usage: "--service-tier <tier>", description: "Override request service tier for this run." },
   { usage: "--thinking <level>", description: "Set thinking level: off | minimal | low | medium | high | xhigh." },
   { usage: "--cwd <path>", description: "Set the working directory for tools." },

package-lock.json (generated)

@@ -10,9 +10,9 @@
       "hasInstallScript": true,
       "license": "MIT",
       "dependencies": {
-        "@companion-ai/alpha-hub": "^0.1.2",
-        "@mariozechner/pi-ai": "^0.64.0",
-        "@mariozechner/pi-coding-agent": "^0.64.0",
+        "@companion-ai/alpha-hub": "^0.1.3",
+        "@mariozechner/pi-ai": "^0.66.1",
+        "@mariozechner/pi-coding-agent": "^0.66.1",
         "@sinclair/typebox": "^0.34.48",
         "dotenv": "^17.3.1"
       },
@@ -781,9 +781,9 @@
       }
     },
     "node_modules/@companion-ai/alpha-hub": {
-      "version": "0.1.2",
-      "resolved": "https://registry.npmjs.org/@companion-ai/alpha-hub/-/alpha-hub-0.1.2.tgz",
-      "integrity": "sha512-YAFh4B6loo7lKRjW3UFsdoiW3ZRvLdSdP7liDsHhCxY1dzfbxNU8vDAloodiK4ieDVRqMBTmG9NYbnsb4NZUGw==",
+      "version": "0.1.3",
+      "resolved": "https://registry.npmjs.org/@companion-ai/alpha-hub/-/alpha-hub-0.1.3.tgz",
+      "integrity": "sha512-g/JoqeGDCoSvkgs1ZSTYJhbTak0zVanQyoYOvf2tDgfqJ09gfkqmSGFDmiP4PkTn1bocPqywZIABgmv25x1uYA==",
       "license": "MIT",
       "dependencies": {
         "@modelcontextprotocol/sdk": "^1.27.1",
@@ -1469,21 +1469,21 @@
       }
     },
     "node_modules/@mariozechner/pi-agent-core": {
-      "version": "0.64.0",
-      "resolved": "https://registry.npmjs.org/@mariozechner/pi-agent-core/-/pi-agent-core-0.64.0.tgz",
-      "integrity": "sha512-IN/sIxWOD0v1OFVXHB605SGiZhO5XdEWG5dO8EAV08n3jz/p12o4OuYGvhGXmHhU28WXa/FGWC+FO5xiIih8Uw==",
+      "version": "0.66.1",
+      "resolved": "https://registry.npmjs.org/@mariozechner/pi-agent-core/-/pi-agent-core-0.66.1.tgz",
+      "integrity": "sha512-Nj54A7SuB/EQi8r3Gs+glFOr9wz/a9uxYFf0pCLf2DE7VmzA9O7WSejrvArna17K6auftLSdNyRRe2bIO0qezg==",
       "license": "MIT",
       "dependencies": {
-        "@mariozechner/pi-ai": "^0.64.0"
+        "@mariozechner/pi-ai": "^0.66.1"
       },
       "engines": {
         "node": ">=20.0.0"
       }
     },
     "node_modules/@mariozechner/pi-ai": {
-      "version": "0.64.0",
-      "resolved": "https://registry.npmjs.org/@mariozechner/pi-ai/-/pi-ai-0.64.0.tgz",
-      "integrity": "sha512-Z/Jnf+JSVDPLRcxJsa8XhYTJKIqKekNueaCpBLGQHgizL1F9RQ1Rur3rIfZpfXkt2cLu/AIPtOs223ueuoWaWg==",
+      "version": "0.66.1",
+      "resolved": "https://registry.npmjs.org/@mariozechner/pi-ai/-/pi-ai-0.66.1.tgz",
+      "integrity": "sha512-7IZHvpsFdKEBkTmjNrdVL7JLUJVIpha6bwTr12cZ5XyDrxij06wP6Ncpnf4HT5BXAzD5w2JnoqTOSbMEIZj3dg==",
       "license": "MIT",
       "dependencies": {
         "@anthropic-ai/sdk": "^0.73.0",
@@ -1508,15 +1508,15 @@
       }
     },
     "node_modules/@mariozechner/pi-coding-agent": {
-      "version": "0.64.0",
-      "resolved": "https://registry.npmjs.org/@mariozechner/pi-coding-agent/-/pi-coding-agent-0.64.0.tgz",
-      "integrity": "sha512-Q4tcqSqFGQtOgCtRyIp1D80Nv2if13Q2pfbnrOlaT/mix90mLcZGML9jKVnT1jGSy5GMYudU1HsS7cx53kxb0g==",
+      "version": "0.66.1",
+      "resolved": "https://registry.npmjs.org/@mariozechner/pi-coding-agent/-/pi-coding-agent-0.66.1.tgz",
+      "integrity": "sha512-cNmatT+5HvYzQ78cRhRih00wCeUTH/fFx9ecJh5AbN7axgWU+bwiZYy0cjrTsGVgMGF4xMYlPRn/Nze9JEB+/w==",
       "license": "MIT",
       "dependencies": {
         "@mariozechner/jiti": "^2.6.2",
-        "@mariozechner/pi-agent-core": "^0.64.0",
-        "@mariozechner/pi-ai": "^0.64.0",
-        "@mariozechner/pi-tui": "^0.64.0",
+        "@mariozechner/pi-agent-core": "^0.66.1",
+        "@mariozechner/pi-ai": "^0.66.1",
+        "@mariozechner/pi-tui": "^0.66.1",
         "@silvia-odwyer/photon-node": "^0.3.4",
         "ajv": "^8.17.1",
         "chalk": "^5.5.0",
@@ -1545,9 +1545,9 @@
       }
     },
     "node_modules/@mariozechner/pi-tui": {
-      "version": "0.64.0",
-      "resolved": "https://registry.npmjs.org/@mariozechner/pi-tui/-/pi-tui-0.64.0.tgz",
-      "integrity": "sha512-W1qLry9MAuN/V3YJmMv/BJa0VaYv721NkXPg/DGItdqWxuDc+1VdNbyAnRwxblNkIpXVUWL26x64BlyFXpxmkg==",
+      "version": "0.66.1",
+      "resolved": "https://registry.npmjs.org/@mariozechner/pi-tui/-/pi-tui-0.66.1.tgz",
+      "integrity": "sha512-hNFN42ebjwtfGooqoUwM+QaPR1XCyqPuueuP3aLOWS1bZ2nZP/jq8MBuGNrmMw1cgiDcotvOlSNj3BatzEOGsw==",
       "license": "MIT",
       "dependencies": {
         "@types/mime-types": "^2.1.4",
@@ -3844,9 +3844,9 @@
       }
     },
     "node_modules/koffi": {
-      "version": "2.15.2",
-      "resolved": "https://registry.npmjs.org/koffi/-/koffi-2.15.2.tgz",
-      "integrity": "sha512-r9tjJLVRSOhCRWdVyQlF3/Ugzeg13jlzS4czS82MAgLff4W+BcYOW7g8Y62t9O5JYjYOLAjAovAZDNlDfZNu+g==",
+      "version": "2.15.6",
+      "resolved": "https://registry.npmjs.org/koffi/-/koffi-2.15.6.tgz",
+      "integrity": "sha512-WQBpM5uo74UQ17UpsFN+PUOrQQg4/nYdey4SGVluQun2drYYfePziLLWdSmFb4wSdWlJC1aimXQnjhPCheRKuw==",
       "hasInstallScript": true,
       "license": "MIT",
       "optional": true,


@@ -59,9 +59,9 @@
   ]
   },
   "dependencies": {
-    "@companion-ai/alpha-hub": "^0.1.2",
-    "@mariozechner/pi-ai": "^0.64.0",
-    "@mariozechner/pi-coding-agent": "^0.64.0",
+    "@companion-ai/alpha-hub": "^0.1.3",
+    "@mariozechner/pi-ai": "^0.66.1",
+    "@mariozechner/pi-coding-agent": "^0.66.1",
     "@sinclair/typebox": "^0.34.48",
     "dotenv": "^17.3.1"
   },


@@ -130,7 +130,7 @@ async function handleModelCommand(subcommand: string | undefined, args: string[]
   if (subcommand === "login") {
     if (args[0]) {
-      // Specific provider given - use OAuth login directly
+      // Specific provider given - resolve OAuth vs API-key setup automatically
       await loginModelProvider(feynmanAuthPath, args[0], feynmanSettingsPath);
     } else {
       // No provider specified - show auth method choice
@@ -147,7 +147,7 @@ async function handleModelCommand(subcommand: string | undefined, args: string[]
   if (subcommand === "set") {
     const spec = args[0];
     if (!spec) {
-      throw new Error("Usage: feynman model set <provider/model>");
+      throw new Error("Usage: feynman model set <provider/model|provider:model>");
     }
     setDefaultModelSpec(feynmanSettingsPath, feynmanAuthPath, spec);
     return;


@@ -75,6 +75,7 @@ const API_KEY_PROVIDERS: ApiKeyProviderInfo[] = [
   { id: "openai", label: "OpenAI Platform API", envVar: "OPENAI_API_KEY" },
   { id: "anthropic", label: "Anthropic API", envVar: "ANTHROPIC_API_KEY" },
   { id: "google", label: "Google Gemini API", envVar: "GEMINI_API_KEY" },
+  { id: "amazon-bedrock", label: "Amazon Bedrock (AWS credential chain)" },
   { id: "openrouter", label: "OpenRouter", envVar: "OPENROUTER_API_KEY" },
   { id: "zai", label: "Z.AI / GLM", envVar: "ZAI_API_KEY" },
   { id: "kimi-coding", label: "Kimi / Moonshot", envVar: "KIMI_API_KEY" },
@@ -91,6 +92,31 @@ const API_KEY_PROVIDERS: ApiKeyProviderInfo[] = [
   { id: "azure-openai-responses", label: "Azure OpenAI (Responses)", envVar: "AZURE_OPENAI_API_KEY" },
 ];

+function resolveApiKeyProvider(input: string): ApiKeyProviderInfo | undefined {
+  const normalizedInput = normalizeProviderId(input);
+  if (!normalizedInput) {
+    return undefined;
+  }
+  return API_KEY_PROVIDERS.find((provider) => provider.id === normalizedInput);
+}
+
+export function resolveModelProviderForCommand(
+  authPath: string,
+  input: string,
+): { kind: "oauth" | "api-key"; id: string } | undefined {
+  const oauthProvider = resolveOAuthProvider(authPath, input);
+  if (oauthProvider) {
+    return { kind: "oauth", id: oauthProvider.id };
+  }
+  const apiKeyProvider = resolveApiKeyProvider(input);
+  if (apiKeyProvider) {
+    return { kind: "api-key", id: apiKeyProvider.id };
+  }
+  return undefined;
+}
+
 async function selectApiKeyProvider(): Promise<ApiKeyProviderInfo | undefined> {
   const choices = API_KEY_PROVIDERS.map(
     (provider) => `${provider.id}${provider.label}${provider.envVar ? ` (${provider.envVar})` : ""}`,
@@ -447,13 +473,66 @@ async function verifyCustomProvider(setup: CustomProviderSetup, authPath: string
   printInfo("Verification: skipped network probe for this API mode.");
 }

-async function configureApiKeyProvider(authPath: string): Promise<boolean> {
-  const provider = await selectApiKeyProvider();
+async function verifyBedrockCredentialChain(): Promise<void> {
+  const { defaultProvider } = await import("@aws-sdk/credential-provider-node");
+  const credentials = await defaultProvider({})();
+  if (!credentials?.accessKeyId || !credentials?.secretAccessKey) {
+    throw new Error("AWS credential chain resolved without usable Bedrock credentials.");
+  }
+}
+
+async function configureBedrockProvider(authPath: string): Promise<boolean> {
+  printSection("AWS Credentials: Amazon Bedrock");
+  printInfo("Feynman will verify the AWS SDK credential chain used by Pi's Bedrock provider.");
+  printInfo("Supported sources include AWS_PROFILE, ~/.aws credentials/config, SSO, ECS/IRSA, and EC2 instance roles.");
+  try {
+    await verifyBedrockCredentialChain();
+    AuthStorage.create(authPath).set("amazon-bedrock", { type: "api_key", key: "<authenticated>" });
+    printSuccess("Verified AWS credential chain and marked Amazon Bedrock as configured.");
+    printInfo("Use `feynman model list` to see available Bedrock models.");
+    return true;
+  } catch (error) {
+    printWarning(`AWS credential verification failed: ${error instanceof Error ? error.message : String(error)}`);
+    printInfo("Configure AWS credentials first, for example:");
+    printInfo("  export AWS_PROFILE=default");
+    printInfo("  # or set AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY");
+    printInfo("  # or use an EC2/ECS/IRSA role with valid Bedrock access");
+    return false;
+  }
+}
+
+function maybeSetRecommendedDefaultModel(settingsPath: string | undefined, authPath: string): void {
+  if (!settingsPath) {
+    return;
+  }
+  const currentSpec = getCurrentModelSpec(settingsPath);
+  const available = getAvailableModelRecords(authPath);
+  const currentValid = currentSpec ? available.some((m) => `${m.provider}/${m.id}` === currentSpec) : false;
+  if ((!currentSpec || !currentValid) && available.length > 0) {
+    const recommended = chooseRecommendedModel(authPath);
+    if (recommended) {
+      setDefaultModelSpec(settingsPath, authPath, recommended.spec);
+    }
+  }
+}
+
+async function configureApiKeyProvider(authPath: string, providerId?: string): Promise<boolean> {
+  const provider = providerId ? resolveApiKeyProvider(providerId) : await selectApiKeyProvider();
   if (!provider) {
+    if (providerId) {
+      throw new Error(`Unknown API-key model provider: ${providerId}`);
+    }
     printInfo("API key setup cancelled.");
     return false;
   }
+  if (provider.id === "amazon-bedrock") {
+    return configureBedrockProvider(authPath);
+  }
   if (provider.id === "__custom__") {
     const setup = await promptCustomProviderSetup();
     if (!setup) {
@@ -512,7 +591,7 @@ async function configureApiKeyProvider(authPath: string): Promise<boolean> {
 }

 function resolveAvailableModelSpec(authPath: string, input: string): string | undefined {
-  const normalizedInput = input.trim().toLowerCase();
+  const normalizedInput = input.trim().replace(/^([^/:]+):(.+)$/, "$1/$2").toLowerCase();
   if (!normalizedInput) {
     return undefined;
   }
@@ -528,6 +607,17 @@ function resolveAvailableModelSpec(authPath: string, input: string): string | un
     return `${exactIdMatches[0]!.provider}/${exactIdMatches[0]!.id}`;
   }

+  // When multiple providers expose the same bare model ID, prefer providers the
+  // user explicitly configured in auth storage.
+  if (exactIdMatches.length > 1) {
+    const authData = readJson(authPath) as Record<string, unknown>;
+    const configuredProviders = new Set(Object.keys(authData));
+    const configuredMatches = exactIdMatches.filter((model) => configuredProviders.has(model.provider));
+    if (configuredMatches.length === 1) {
+      return `${configuredMatches[0]!.provider}/${configuredMatches[0]!.id}`;
+    }
+  }
+
   return undefined;
 }
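
The ambiguity rule in this hunk can be isolated as a small pure function: a bare model ID resolves only when the match is unique outright, or when exactly one of the matching providers is configured. The `resolveBareId` helper and record shapes below are hypothetical stand-ins for the repo's internals:

```typescript
// Sketch of the bare-ID disambiguation rule; shapes are illustrative.
interface ModelRecord {
  provider: string;
  id: string;
}

function resolveBareId(
  matches: ModelRecord[],
  configuredProviders: Set<string>,
): string | undefined {
  if (matches.length === 1) {
    return `${matches[0].provider}/${matches[0].id}`;
  }
  const configured = matches.filter((m) => configuredProviders.has(m.provider));
  // Still ambiguous (zero or several configured providers): refuse to guess.
  return configured.length === 1
    ? `${configured[0].provider}/${configured[0].id}`
    : undefined;
}

const matches: ModelRecord[] = [
  { provider: "openai", id: "gpt-5.4" },
  { provider: "azure-openai-responses", id: "gpt-5.4" },
];
console.log(resolveBareId(matches, new Set(["openai"]))); // openai/gpt-5.4
console.log(resolveBareId(matches, new Set())); // undefined
```

Refusing to guess when nothing disambiguates keeps `feynman model set gpt-5.4` deterministic: it only succeeds once the user's auth storage makes the intent unambiguous.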
@@ -574,16 +664,8 @@ export async function authenticateModelProvider(authPath: string, settingsPath?:
   if (selection === 0) {
     const configured = await configureApiKeyProvider(authPath);
-    if (configured && settingsPath) {
-      const currentSpec = getCurrentModelSpec(settingsPath);
-      const available = getAvailableModelRecords(authPath);
-      const currentValid = currentSpec ? available.some((m) => `${m.provider}/${m.id}` === currentSpec) : false;
-      if ((!currentSpec || !currentValid) && available.length > 0) {
-        const recommended = chooseRecommendedModel(authPath);
-        if (recommended) {
-          setDefaultModelSpec(settingsPath, authPath, recommended.spec);
-        }
-      }
+    if (configured) {
+      maybeSetRecommendedDefaultModel(settingsPath, authPath);
     }
     return configured;
   }
@@ -597,10 +679,24 @@ export async function authenticateModelProvider(authPath: string, settingsPath?:
 }

 export async function loginModelProvider(authPath: string, providerId?: string, settingsPath?: string): Promise<boolean> {
+  if (providerId) {
+    const resolvedProvider = resolveModelProviderForCommand(authPath, providerId);
+    if (!resolvedProvider) {
+      throw new Error(`Unknown model provider: ${providerId}`);
+    }
+    if (resolvedProvider.kind === "api-key") {
+      const configured = await configureApiKeyProvider(authPath, resolvedProvider.id);
+      if (configured) {
+        maybeSetRecommendedDefaultModel(settingsPath, authPath);
+      }
+      return configured;
+    }
+  }
   const provider = providerId ? resolveOAuthProvider(authPath, providerId) : await selectOAuthProvider(authPath, "login");
   if (!provider) {
     if (providerId) {
-      throw new Error(`Unknown OAuth model provider: ${providerId}`);
+      throw new Error(`Unknown model provider: ${providerId}`);
     }
     printInfo("Login cancelled.");
     return false;
@@ -637,35 +733,38 @@ export async function loginModelProvider(authPath: string, providerId?: string,
   printSuccess(`Model provider login complete: ${provider.id}`);
-  if (settingsPath) {
-    const currentSpec = getCurrentModelSpec(settingsPath);
-    const available = getAvailableModelRecords(authPath);
-    const currentValid = currentSpec
-      ? available.some((m) => `${m.provider}/${m.id}` === currentSpec)
-      : false;
-    if ((!currentSpec || !currentValid) && available.length > 0) {
-      const recommended = chooseRecommendedModel(authPath);
-      if (recommended) {
-        setDefaultModelSpec(settingsPath, authPath, recommended.spec);
-      }
-    }
-  }
+  maybeSetRecommendedDefaultModel(settingsPath, authPath);
   return true;
 }

 export async function logoutModelProvider(authPath: string, providerId?: string): Promise<void> {
-  const provider = providerId ? resolveOAuthProvider(authPath, providerId) : await selectOAuthProvider(authPath, "logout");
-  if (!provider) {
-    if (providerId) {
-      throw new Error(`Unknown OAuth model provider: ${providerId}`);
-    }
+  const authStorage = AuthStorage.create(authPath);
+  if (providerId) {
+    const resolvedProvider = resolveModelProviderForCommand(authPath, providerId);
+    if (resolvedProvider) {
+      authStorage.logout(resolvedProvider.id);
+      printSuccess(`Model provider logout complete: ${resolvedProvider.id}`);
+      return;
+    }
+    const normalizedProviderId = normalizeProviderId(providerId);
+    if (authStorage.has(normalizedProviderId)) {
+      authStorage.logout(normalizedProviderId);
+      printSuccess(`Model provider logout complete: ${normalizedProviderId}`);
+      return;
+    }
+    throw new Error(`Unknown model provider: ${providerId}`);
+  }
+  const provider = await selectOAuthProvider(authPath, "logout");
+  if (!provider) {
     printInfo("Logout cancelled.");
     return;
   }
-  AuthStorage.create(authPath).logout(provider.id);
+  authStorage.logout(provider.id);
   printSuccess(`Model provider logout complete: ${provider.id}`);
 }


@@ -6,7 +6,7 @@ import { join } from "node:path";
 import { resolveInitialPrompt } from "../src/cli.js";
 import { buildModelStatusSnapshotFromRecords, chooseRecommendedModel } from "../src/model/catalog.js";
-import { setDefaultModelSpec } from "../src/model/commands.js";
+import { resolveModelProviderForCommand, setDefaultModelSpec } from "../src/model/commands.js";

 function createAuthPath(contents: Record<string, unknown>): string {
   const root = mkdtempSync(join(tmpdir(), "feynman-auth-"));
@@ -42,6 +42,56 @@ test("setDefaultModelSpec accepts a unique bare model id from authenticated mode
   assert.equal(settings.defaultModel, "gpt-5.4");
 });

+test("setDefaultModelSpec accepts provider:model syntax for authenticated models", () => {
+  const authPath = createAuthPath({
+    google: { type: "api_key", key: "google-test-key" },
+  });
+  const settingsPath = join(mkdtempSync(join(tmpdir(), "feynman-settings-")), "settings.json");
+  setDefaultModelSpec(settingsPath, authPath, "google:gemini-3-pro-preview");
+  const settings = JSON.parse(readFileSync(settingsPath, "utf8")) as {
+    defaultProvider?: string;
+    defaultModel?: string;
+  };
+  assert.equal(settings.defaultProvider, "google");
+  assert.equal(settings.defaultModel, "gemini-3-pro-preview");
+});
+
+test("resolveModelProviderForCommand falls back to API-key providers when OAuth is unavailable", () => {
+  const authPath = createAuthPath({});
+  const resolved = resolveModelProviderForCommand(authPath, "google");
+  assert.equal(resolved?.kind, "api-key");
+  assert.equal(resolved?.id, "google");
+});
+
+test("resolveModelProviderForCommand prefers OAuth when a provider supports both auth modes", () => {
+  const authPath = createAuthPath({});
+  const resolved = resolveModelProviderForCommand(authPath, "anthropic");
+  assert.equal(resolved?.kind, "oauth");
+  assert.equal(resolved?.id, "anthropic");
+});
+
+test("setDefaultModelSpec prefers the explicitly configured provider when a bare model id is ambiguous", () => {
+  const authPath = createAuthPath({
+    openai: { type: "api_key", key: "openai-test-key" },
+  });
+  const settingsPath = join(mkdtempSync(join(tmpdir(), "feynman-settings-")), "settings.json");
+  setDefaultModelSpec(settingsPath, authPath, "gpt-5.4");
+  const settings = JSON.parse(readFileSync(settingsPath, "utf8")) as {
+    defaultProvider?: string;
+    defaultModel?: string;
+  };
+  assert.equal(settings.defaultProvider, "openai");
+  assert.equal(settings.defaultModel, "gpt-5.4");
+});
+
 test("buildModelStatusSnapshotFromRecords flags an invalid current model and suggests a replacement", () => {
   const snapshot = buildModelStatusSnapshotFromRecords(
     [


@@ -22,17 +22,18 @@ The `settings.json` file is the primary configuration file. It is created by `fe
 ```json
 {
-  "defaultModel": "anthropic:claude-sonnet-4-20250514",
-  "thinkingLevel": "medium"
+  "defaultProvider": "anthropic",
+  "defaultModel": "claude-sonnet-4-20250514",
+  "defaultThinkingLevel": "medium"
 }
 ```

 ## Model configuration

-The `defaultModel` field sets which model is used when you launch Feynman without the `--model` flag. The format is `provider:model-name`. You can change it via the CLI:
+The `defaultProvider` and `defaultModel` fields set which model is used when you launch Feynman without the `--model` flag. You can change them via the CLI:

 ```bash
-feynman model set anthropic:claude-opus-4-20250514
+feynman model set anthropic/claude-opus-4-20250514
 ```
To see all models you have configured: To see all models you have configured:
@@ -48,6 +49,7 @@ To add another provider, authenticate it first:
 ```bash
 feynman model login anthropic
 feynman model login google
+feynman model login amazon-bedrock
 ```
Then switch the default model: Then switch the default model:
@@ -56,6 +58,8 @@ Then switch the default model:
feynman model set anthropic/claude-opus-4-6 feynman model set anthropic/claude-opus-4-6
``` ```
The `model set` command accepts both `provider/model` and `provider:model` formats. `feynman model login google` opens the API-key flow directly, while `feynman model login amazon-bedrock` verifies the AWS credential chain that Pi uses for Bedrock access.
## Subagent model overrides

Feynman's bundled subagents inherit the main default model unless you override them explicitly. Inside the REPL, run:
@@ -90,7 +94,8 @@ Feynman respects the following environment variables, which take precedence over
| `FEYNMAN_THINKING` | Override the thinking level |
| `ANTHROPIC_API_KEY` | Anthropic API key |
| `OPENAI_API_KEY` | OpenAI API key |
| `GEMINI_API_KEY` | Google Gemini API key |
| `AWS_PROFILE` | Preferred AWS profile for Amazon Bedrock |
| `TAVILY_API_KEY` | Tavily web search API key |
| `SERPER_API_KEY` | Serper web search API key |
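The precedence rule, environment variables over `settings.json`, can be sketched like this. Only `FEYNMAN_THINKING` is taken from the table above; the mapping structure and helper are illustrative, not Feynman's internals:

```python
import os

# Only FEYNMAN_THINKING comes from the table above; extend the mapping
# for other variables as needed (illustrative, not Feynman's internals).
ENV_OVERRIDES = {
    "defaultThinkingLevel": "FEYNMAN_THINKING",
}


def effective_setting(key: str, settings: dict, env=None):
    """Return the env override when set, otherwise the settings.json value."""
    env = os.environ if env is None else env
    var = ENV_OVERRIDES.get(key)
    if var and env.get(var):
        return env[var]
    return settings.get(key)
```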

View File

@@ -27,6 +27,12 @@ irm https://feynman.is/install.ps1 | iex
This installs the Windows runtime bundle under `%LOCALAPPDATA%\Programs\feynman`, adds its launcher to your user `PATH`, and lets you re-run the installer at any time to update.
## Updating the standalone app
To update the standalone Feynman app on macOS, Linux, or Windows, rerun the installer you originally used. That replaces the downloaded runtime bundle with the latest tagged release.
`feynman update` is different: it updates installed Pi packages inside Feynman's environment, not the standalone app bundle itself.
## Skills only

If you only want Feynman's research skills and not the full terminal runtime, install the skill library separately.

View File

@@ -28,7 +28,7 @@ Feynman supports multiple model providers. The setup wizard presents a list of a
google:gemini-2.5-pro
```

The model you choose here becomes the default for all sessions. You can override it per-session with the `--model` flag or change it later via `feynman model set <provider/model>` or `feynman model set <provider:model>`.

## Stage 2: Authentication
@@ -42,6 +42,16 @@ For API key providers, you are prompted to paste your key directly:
Keys are encrypted at rest and never sent anywhere except the provider's API endpoint.
### Amazon Bedrock
For Amazon Bedrock, choose:
```text
Amazon Bedrock (AWS credential chain)
```
Feynman verifies the same AWS credential chain Pi uses at runtime, including `AWS_PROFILE`, `~/.aws` credentials/config, SSO, ECS/IRSA, and EC2 instance roles. Once that check passes, Bedrock models become available in `feynman model list` without needing a traditional API key.
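A simplified sketch of the local parts of that chain follows; the helper name and probe order are illustrative, and real resolution (as in the AWS SDKs and Pi) also covers SSO, ECS/IRSA, and EC2 instance roles:

```python
import os
from pathlib import Path


def detect_credential_source(env=None, home=None):
    """Probe the local AWS credential sources in a simplified order."""
    env = os.environ if env is None else env
    home = Path.home() if home is None else Path(home)
    if env.get("AWS_ACCESS_KEY_ID") and env.get("AWS_SECRET_ACCESS_KEY"):
        return "environment"
    if env.get("AWS_PROFILE"):
        return f"profile:{env['AWS_PROFILE']}"
    if (home / ".aws" / "credentials").exists():
        return "shared-credentials-file"
    if (home / ".aws" / "config").exists():
        return "shared-config-file"
    return None  # fall through to SSO / ECS / instance-role checks
```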
### Local models: Ollama, LM Studio, vLLM

If you want to use a model running locally, choose the API-key flow and then select:

View File

@@ -23,11 +23,11 @@ This page covers the dedicated Feynman CLI commands and flags. Workflow commands
| Command | Description |
| --- | --- |
| `feynman model list` | List available models in Pi auth storage |
| `feynman model login [id]` | Authenticate a model provider with OAuth or API-key setup |
| `feynman model logout [id]` | Clear stored auth for a model provider |
| `feynman model set <provider/model>` | Set the default model for all sessions |

These commands manage your model provider configuration. The `model set` command updates `~/.feynman/settings.json` with the new default. It accepts either `provider/model-name` or `provider:model-name`, for example `anthropic/claude-sonnet-4-20250514` or `anthropic:claude-sonnet-4-20250514`. Running `feynman model login google` or `feynman model login amazon-bedrock` routes directly into the relevant API-key setup flow instead of requiring the interactive picker.
## AlphaXiv commands
@@ -76,7 +76,7 @@ These are equivalent to launching the REPL and typing the corresponding slash co
| Flag | Description |
| --- | --- |
| `--prompt "<text>"` | Run one prompt and exit (one-shot mode) |
| `--model <provider/model \| provider:model>` | Force a specific model for this session |
| `--thinking <level>` | Set thinking level: `off`, `minimal`, `low`, `medium`, `high`, `xhigh` |
| `--cwd <path>` | Set the working directory for all file operations |
| `--session-dir <path>` | Set the session storage directory |
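These flags compose straightforwardly. As a sketch, a wrapper script might assemble a one-shot invocation like this (the helper is hypothetical, not part of Feynman):

```python
def one_shot_argv(prompt, model=None, thinking=None, cwd=None):
    """Assemble a one-shot feynman command line from the flags above."""
    argv = ["feynman", "--prompt", prompt]
    if model:
        argv += ["--model", model]  # provider/model or provider:model
    if thinking:
        argv += ["--thinking", thinking]
    if cwd:
        argv += ["--cwd", cwd]
    return argv
```

Pass the resulting list to something like `subprocess.run` to launch the session.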

View File

@@ -74,3 +74,5 @@ feynman update pi-subagents
```

Running `feynman update` without arguments updates everything. Pass a specific package name to update just that one. Updates are safe and preserve your configuration.
This command updates Pi packages inside Feynman's environment. To upgrade the standalone Feynman app itself, rerun the installer from the [Installation guide](/docs/getting-started/installation).