diff --git a/CHANGELOG.md b/CHANGELOG.md index baadef1..83b55f7 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -113,3 +113,21 @@ Use this file to track chronology, not release notes. Keep entries short, factua - Failed / learned: The remaining ValiChord PR is still stale and mixes a real prompt/skill update with unrelated branch churn; it is a review/triage item, not a clean merge candidate. - Blockers: No local build blockers remain; issue/PR closure still depends on the final push landing on `main`. - Next: Push the verified cleanup commit, then close issues fixed by the dependency bump plus the new discoverability/service-tier/Windows patches, and close the stale ValiChord PR explicitly instead of leaving it open indefinitely. + +### 2026-04-09 09:37 PDT — windows-startup-import-specifiers + +- Objective: Fix Windows startup failures where `feynman` exits before the Pi child process initializes. +- Changed: Converted the Node preload module paths passed via `node --import` in `src/pi/launch.ts` to `file://` specifiers using a new `toNodeImportSpecifier(...)` helper in `src/pi/runtime.ts`; expanded `scripts/patch-embedded-pi.mjs` so it also patches the bundled workspace copy of Pi's extension loader when present. +- Verified: Added a regression test in `tests/pi-runtime.test.ts` covering absolute-path to `file://` conversion for preload imports; ran `npm test`, `npm run typecheck`, and `npm run build`. +- Failed / learned: The raw Windows `ERR_UNSUPPORTED_ESM_URL_SCHEME` stack is more consistent with Node rejecting the child-process `--import C:\\...` preload before Pi starts than with a normal in-app extension load failure. +- Blockers: Windows runtime execution was not available locally, so the fix is verified by code path inspection and automated tests rather than an actual Windows shell run. 
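Editor's note: the `toNodeImportSpecifier(...)` helper this entry describes appears verbatim later in the diff (`src/pi/runtime.ts`); the standalone sketch below shows why it fixes the Windows failure. Node's `--import` flag expects a URL or relative specifier, and a bare absolute Windows path such as `C:\app\preload.mjs` parses as a URL with scheme `c:`, which Node rejects with `ERR_UNSUPPORTED_ESM_URL_SCHEME`.

```javascript
import { isAbsolute } from "node:path";
import { pathToFileURL } from "node:url";

// Same shape as the helper added in src/pi/runtime.ts: absolute paths become
// file:// URLs, relative specifiers pass through untouched.
function toNodeImportSpecifier(modulePath) {
  return isAbsolute(modulePath) ? pathToFileURL(modulePath).href : modulePath;
}

console.log(toNodeImportSpecifier("./relative/preload.mjs")); // unchanged
console.log(toNodeImportSpecifier(process.cwd() + "/preload.mjs")); // file:// URL
```

On POSIX the conversion is cosmetic; on Windows it is the difference between the child process starting and Node exiting before Pi initializes.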
+- Next: Ask the affected user to reinstall or update to the next published package once released, and confirm the Windows REPL now starts from a normal PowerShell session. + +### 2026-04-09 11:02 PDT — tracker-hardening-pass + +- Objective: Triage the open repo backlog, land the highest-signal fixes locally, and add guardrails against stale promotional workflow content. +- Changed: Hardened Windows launch paths in `bin/feynman.js`, `scripts/build-native-bundle.mjs`, and `scripts/install/install.ps1`; set npm prefix overrides earlier in `scripts/patch-embedded-pi.mjs`; added a `pi-web-access` runtime patch helper plus `FEYNMAN_WEB_SEARCH_CONFIG` env wiring so bundled web search reads the same `~/.feynman/web-search.json` that doctor/status report; taught `src/pi/web-access.ts` to honor the legacy `route` key; fixed bundled skill references and expanded the skills-only installers/docs to ship the prompt and guidance files those skills reference; added regression tests for config paths, catalog snapshot edges, skill-path packaging, `pi-web-access` patching, and blocked promotional content. +- Verified: Ran `npm test`, `npm run typecheck`, and `npm run build` successfully after the full maintenance pass. +- Failed / learned: The skills-only install issue was not just docs drift; the shipped `SKILL.md` files referenced prompt paths that only made sense after installation, so the repo needed both path normalization and packaging changes. +- Blockers: Remote issue/PR closure and merge actions still depend on the final reviewed branch state being pushed. +- Next: Push the validated fixes, close the duplicate Windows/reporting issues they supersede, reject the promotional ValiChord PR explicitly, and then review whether the remaining docs-only or feature PRs should be merged separately. 
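Editor's note: the "honor the legacy `route` key" change this entry summarizes shows up later in the diff as a fallback chain in `src/pi/web-access.ts`. A minimal standalone sketch of that precedence (modern keys first, then the legacy `route` key, then the `"auto"` default):

```javascript
// Mirrors the provider-resolution order added to getPiWebAccessStatus:
// searchProvider, then legacy route, then provider, then "auto".
const PROVIDERS = new Set(["auto", "perplexity", "exa", "gemini"]);

function normalizeProvider(value) {
  return PROVIDERS.has(value) ? value : undefined;
}

function resolveSearchProvider(config) {
  return (
    normalizeProvider(config.searchProvider) ??
    normalizeProvider(config.route) ??
    normalizeProvider(config.provider) ??
    "auto"
  );
}

console.log(resolveSearchProvider({ route: "exa" })); // "exa"
console.log(resolveSearchProvider({ route: "bogus" })); // "auto"
```

Unknown values fall through each `??` step rather than erroring, so an old config file with only `route` set keeps working.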
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 77cd88c..9ddbea9 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -59,6 +59,7 @@ npm run build - Avoid refactor-only PRs unless they are necessary to unblock a real fix or requested by a maintainer. - Do not silently change release behavior, installer behavior, or runtime defaults without documenting the reason in the PR. - Use American English in docs, comments, prompts, UI copy, and examples. +- Do not add bundled prompts, skills, or docs whose primary purpose is to market, endorse, or funnel users toward a third-party product or service. Product integrations must be justified by user-facing utility and written in neutral language. ## Repo-Specific Checks diff --git a/README.md b/README.md index 3110060..017a847 100644 --- a/README.md +++ b/README.md @@ -25,7 +25,7 @@ curl -fsSL https://feynman.is/install | bash irm https://feynman.is/install.ps1 | iex ``` -The one-line installer fetches the latest tagged release. To pin a version, pass it explicitly, for example `curl -fsSL https://feynman.is/install | bash -s -- 0.2.16`. +The one-line installer fetches the latest tagged release. To pin a version, pass it explicitly, for example `curl -fsSL https://feynman.is/install | bash -s -- 0.2.17`. If you install via `pnpm` or `bun` instead of the standalone bundle, Feynman requires Node.js `20.19.0` or newer. @@ -63,6 +63,8 @@ curl -fsSL https://feynman.is/install-skills | bash -s -- --repo That installs into `.agents/skills/feynman` under the current repository. +These installers download the bundled `skills/` and `prompts/` trees plus the repo guidance files referenced by those skills. They do not install the Feynman terminal, bundled Node runtime, auth storage, or Pi packages. 
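Editor's note: the `bin/feynman.js` hunk below references `parseNodeVersion` and `compareNodeVersions` without showing their bodies. The following is a hypothetical sketch of how such a gate against the documented `20.19.0` minimum could work; the actual implementations in the repo may differ.

```javascript
// Hypothetical version gate: parse dotted versions into number arrays and
// compare component-wise, treating missing components as 0.
const MIN_NODE_VERSION = "20.19.0";

function parseNodeVersion(version) {
  return version.split(".").map((part) => Number.parseInt(part, 10));
}

function compareNodeVersions(a, b) {
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    const diff = (a[i] ?? 0) - (b[i] ?? 0);
    if (diff !== 0) return diff < 0 ? -1 : 1;
  }
  return 0;
}

const ok =
  compareNodeVersions(parseNodeVersion(process.versions.node), parseNodeVersion(MIN_NODE_VERSION)) >= 0;
console.log(ok ? "Node version is new enough" : "Node is older than the minimum");
```

The component-wise comparison matters: a plain string comparison would rank `"20.9.0"` above `"20.19.0"`.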
+ --- ### What you type → what happens diff --git a/bin/feynman.js b/bin/feynman.js index f94acb6..085449b 100755 --- a/bin/feynman.js +++ b/bin/feynman.js @@ -1,4 +1,7 @@ #!/usr/bin/env node +import { resolve } from "node:path"; +import { pathToFileURL } from "node:url"; + const MIN_NODE_VERSION = "20.19.0"; function parseNodeVersion(version) { @@ -27,5 +30,7 @@ if (compareNodeVersions(parseNodeVersion(process.versions.node), parseNodeVersio : "curl -fsSL https://feynman.is/install | bash"); process.exit(1); } -await import(new URL("../scripts/patch-embedded-pi.mjs", import.meta.url).href); -await import(new URL("../dist/index.js", import.meta.url).href); +const here = import.meta.dirname; + +await import(pathToFileURL(resolve(here, "..", "scripts", "patch-embedded-pi.mjs")).href); +await import(pathToFileURL(resolve(here, "..", "dist", "index.js")).href); diff --git a/package-lock.json b/package-lock.json index 7793523..3e042a0 100644 --- a/package-lock.json +++ b/package-lock.json @@ -1,12 +1,12 @@ { "name": "@companion-ai/feynman", - "version": "0.2.16", + "version": "0.2.17", "lockfileVersion": 3, "requires": true, "packages": { "": { "name": "@companion-ai/feynman", - "version": "0.2.16", + "version": "0.2.17", "hasInstallScript": true, "license": "MIT", "dependencies": { diff --git a/package.json b/package.json index e8eaaf7..0c124e4 100644 --- a/package.json +++ b/package.json @@ -1,6 +1,6 @@ { "name": "@companion-ai/feynman", - "version": "0.2.16", + "version": "0.2.17", "description": "Research-first CLI agent built on Pi and alphaXiv", "license": "MIT", "type": "module", diff --git a/scripts/build-native-bundle.mjs b/scripts/build-native-bundle.mjs index 29e126c..9da4a53 100644 --- a/scripts/build-native-bundle.mjs +++ b/scripts/build-native-bundle.mjs @@ -275,7 +275,8 @@ function writeLauncher(bundleRoot, target) { "@echo off", "setlocal", 'set "ROOT=%~dp0"', - '"%ROOT%node\\node.exe" "%ROOT%app\\bin\\feynman.js" %*', + 'if "%ROOT:~-1%"=="\\" set 
"ROOT=%ROOT:~0,-1%"', + '"%ROOT%\\node\\node.exe" "%ROOT%\\app\\bin\\feynman.js" %*', "", ].join("\r\n"), "utf8", diff --git a/scripts/install/install-skills.ps1 b/scripts/install/install-skills.ps1 index c650efa..b081451 100644 --- a/scripts/install/install-skills.ps1 +++ b/scripts/install/install-skills.ps1 @@ -92,8 +92,9 @@ try { } $skillsSource = Join-Path $sourceRoot.FullName "skills" - if (-not (Test-Path $skillsSource)) { - throw "Could not find skills/ in downloaded archive." + $promptsSource = Join-Path $sourceRoot.FullName "prompts" + if (-not (Test-Path $skillsSource) -or -not (Test-Path $promptsSource)) { + throw "Could not find the bundled skills resources in the downloaded archive." } $installParent = Split-Path $installDir -Parent @@ -107,6 +108,10 @@ try { New-Item -ItemType Directory -Path $installDir -Force | Out-Null Copy-Item -Path (Join-Path $skillsSource "*") -Destination $installDir -Recurse -Force + New-Item -ItemType Directory -Path (Join-Path $installDir "prompts") -Force | Out-Null + Copy-Item -Path (Join-Path $promptsSource "*") -Destination (Join-Path $installDir "prompts") -Recurse -Force + Copy-Item -Path (Join-Path $sourceRoot.FullName "AGENTS.md") -Destination (Join-Path $installDir "AGENTS.md") -Force + Copy-Item -Path (Join-Path $sourceRoot.FullName "CONTRIBUTING.md") -Destination (Join-Path $installDir "CONTRIBUTING.md") -Force Write-Host "==> Installed skills to $installDir" if ($Scope -eq "Repo") { diff --git a/scripts/install/install-skills.sh b/scripts/install/install-skills.sh index 0e1df44..f832ae9 100644 --- a/scripts/install/install-skills.sh +++ b/scripts/install/install-skills.sh @@ -181,8 +181,8 @@ step "Extracting skills" tar -xzf "$archive_path" -C "$extract_dir" source_root="$(find "$extract_dir" -mindepth 1 -maxdepth 1 -type d | head -n 1)" -if [ -z "$source_root" ] || [ ! -d "$source_root/skills" ]; then - echo "Could not find skills/ in downloaded archive." >&2 +if [ -z "$source_root" ] || [ ! 
-d "$source_root/skills" ] || [ ! -d "$source_root/prompts" ]; then + echo "Could not find the bundled skills resources in the downloaded archive." >&2 exit 1 fi @@ -190,6 +190,10 @@ mkdir -p "$(dirname "$install_dir")" rm -rf "$install_dir" mkdir -p "$install_dir" cp -R "$source_root/skills/." "$install_dir/" +mkdir -p "$install_dir/prompts" +cp -R "$source_root/prompts/." "$install_dir/prompts/" +cp "$source_root/AGENTS.md" "$install_dir/AGENTS.md" +cp "$source_root/CONTRIBUTING.md" "$install_dir/CONTRIBUTING.md" step "Installed skills to $install_dir" case "$SCOPE" in diff --git a/scripts/install/install.ps1 b/scripts/install/install.ps1 index 865c0c7..06564b5 100644 --- a/scripts/install/install.ps1 +++ b/scripts/install/install.ps1 @@ -125,12 +125,18 @@ Workarounds: New-Item -ItemType Directory -Path $installBinDir -Force | Out-Null $shimPath = Join-Path $installBinDir "feynman.cmd" + $shimPs1Path = Join-Path $installBinDir "feynman.ps1" Write-Host "==> Linking feynman into $installBinDir" @" @echo off -"$bundleDir\feynman.cmd" %* +CALL "$bundleDir\feynman.cmd" %* "@ | Set-Content -Path $shimPath -Encoding ASCII + @" +`$BundleDir = "$bundleDir" +& "`$BundleDir\node\node.exe" "`$BundleDir\app\bin\feynman.js" @args +"@ | Set-Content -Path $shimPs1Path -Encoding UTF8 + $currentUserPath = [Environment]::GetEnvironmentVariable("Path", "User") $alreadyOnPath = $false if ($currentUserPath) { diff --git a/scripts/lib/pi-web-access-patch.d.mts b/scripts/lib/pi-web-access-patch.d.mts new file mode 100644 index 0000000..ea07a72 --- /dev/null +++ b/scripts/lib/pi-web-access-patch.d.mts @@ -0,0 +1,2 @@ +export const PI_WEB_ACCESS_PATCH_TARGETS: string[]; +export function patchPiWebAccessSource(relativePath: string, source: string): string; diff --git a/scripts/lib/pi-web-access-patch.mjs b/scripts/lib/pi-web-access-patch.mjs new file mode 100644 index 0000000..f47c918 --- /dev/null +++ b/scripts/lib/pi-web-access-patch.mjs @@ -0,0 +1,32 @@ +export const 
PI_WEB_ACCESS_PATCH_TARGETS = [ + "index.ts", + "exa.ts", + "gemini-api.ts", + "gemini-search.ts", + "gemini-web.ts", + "github-extract.ts", + "perplexity.ts", + "video-extract.ts", + "youtube-extract.ts", +]; + +const LEGACY_CONFIG_EXPR = 'join(homedir(), ".pi", "web-search.json")'; +const PATCHED_CONFIG_EXPR = + 'process.env.FEYNMAN_WEB_SEARCH_CONFIG ?? process.env.PI_WEB_SEARCH_CONFIG ?? join(homedir(), ".pi", "web-search.json")'; + +export function patchPiWebAccessSource(relativePath, source) { + let patched = source; + + if (patched.includes(PATCHED_CONFIG_EXPR)) { + return patched; + } + + patched = patched.split(LEGACY_CONFIG_EXPR).join(PATCHED_CONFIG_EXPR); + + if (relativePath === "index.ts" && patched !== source) { + patched = patched.replace('import { join } from "node:path";', 'import { dirname, join } from "node:path";'); + patched = patched.replace('const dir = join(homedir(), ".pi");', "const dir = dirname(WEB_SEARCH_CONFIG_PATH);"); + } + + return patched; +} diff --git a/scripts/patch-embedded-pi.mjs b/scripts/patch-embedded-pi.mjs index af82f13..30553da 100644 --- a/scripts/patch-embedded-pi.mjs +++ b/scripts/patch-embedded-pi.mjs @@ -1,14 +1,21 @@ import { spawnSync } from "node:child_process"; import { existsSync, mkdirSync, readFileSync, rmSync, writeFileSync } from "node:fs"; import { createRequire } from "node:module"; +import { homedir } from "node:os"; import { dirname, resolve } from "node:path"; import { fileURLToPath } from "node:url"; import { FEYNMAN_LOGO_HTML } from "../logo.mjs"; import { patchPiExtensionLoaderSource } from "./lib/pi-extension-loader-patch.mjs"; +import { PI_WEB_ACCESS_PATCH_TARGETS, patchPiWebAccessSource } from "./lib/pi-web-access-patch.mjs"; import { PI_SUBAGENTS_PATCH_TARGETS, patchPiSubagentsSource } from "./lib/pi-subagents-patch.mjs"; const here = dirname(fileURLToPath(import.meta.url)); const appRoot = resolve(here, ".."); +const feynmanHome = resolve(process.env.FEYNMAN_HOME ?? 
homedir(), ".feynman"); +const feynmanNpmPrefix = resolve(feynmanHome, "npm-global"); +process.env.FEYNMAN_NPM_PREFIX = feynmanNpmPrefix; +process.env.NPM_CONFIG_PREFIX = feynmanNpmPrefix; +process.env.npm_config_prefix = feynmanNpmPrefix; const appRequire = createRequire(resolve(appRoot, "package.json")); const isGlobalInstall = process.env.npm_config_global === "true" || process.env.npm_config_location === "global"; @@ -57,6 +64,15 @@ const extensionLoaderPath = piPackageRoot ? resolve(piPackageRoot, "dist", "core const terminalPath = piTuiRoot ? resolve(piTuiRoot, "dist", "terminal.js") : null; const editorPath = piTuiRoot ? resolve(piTuiRoot, "dist", "components", "editor.js") : null; const workspaceRoot = resolve(appRoot, ".feynman", "npm", "node_modules"); +const workspaceExtensionLoaderPath = resolve( + workspaceRoot, + "@mariozechner", + "pi-coding-agent", + "dist", + "core", + "extensions", + "loader.js", +); const vendorOverrideRoot = resolve(appRoot, ".feynman", "vendor-overrides"); const piSubagentsRoot = resolve(workspaceRoot, "pi-subagents"); const webAccessPath = resolve(workspaceRoot, "pi-web-access", "index.ts"); @@ -76,7 +92,17 @@ const workspaceArchivePath = resolve(appRoot, ".feynman", "runtime-workspace.tgz function createInstallCommand(packageManager, packageSpecs) { switch (packageManager) { case "npm": - return ["install", "--prefer-offline", "--no-audit", "--no-fund", "--loglevel", "error", ...packageSpecs]; + return [ + "install", + "--global=false", + "--location=project", + "--prefer-offline", + "--no-audit", + "--no-fund", + "--loglevel", + "error", + ...packageSpecs, + ]; case "pnpm": return ["add", "--prefer-offline", "--reporter", "silent", ...packageSpecs]; case "bun": @@ -367,11 +393,15 @@ if (interactiveModePath && existsSync(interactiveModePath)) { } } -if (extensionLoaderPath && existsSync(extensionLoaderPath)) { - const source = readFileSync(extensionLoaderPath, "utf8"); +for (const loaderPath of [extensionLoaderPath, 
workspaceExtensionLoaderPath].filter(Boolean)) { + if (!existsSync(loaderPath)) { + continue; + } + + const source = readFileSync(loaderPath, "utf8"); const patched = patchPiExtensionLoaderSource(source); if (patched !== source) { - writeFileSync(extensionLoaderPath, patched, "utf8"); + writeFileSync(loaderPath, patched, "utf8"); } } @@ -560,6 +590,21 @@ if (existsSync(webAccessPath)) { } } +const piWebAccessRoot = resolve(workspaceRoot, "pi-web-access"); + +if (existsSync(piWebAccessRoot)) { + for (const relativePath of PI_WEB_ACCESS_PATCH_TARGETS) { + const entryPath = resolve(piWebAccessRoot, relativePath); + if (!existsSync(entryPath)) continue; + + const source = readFileSync(entryPath, "utf8"); + const patched = patchPiWebAccessSource(relativePath, source); + if (patched !== source) { + writeFileSync(entryPath, patched, "utf8"); + } + } +} + if (existsSync(sessionSearchIndexerPath)) { const source = readFileSync(sessionSearchIndexerPath, "utf8"); const original = 'const sessionsDir = path.join(os.homedir(), ".pi", "agent", "sessions");'; diff --git a/skills/autoresearch/SKILL.md b/skills/autoresearch/SKILL.md index 1cda828..9074213 100644 --- a/skills/autoresearch/SKILL.md +++ b/skills/autoresearch/SKILL.md @@ -5,7 +5,7 @@ description: Autonomous experiment loop that tries ideas, measures results, keep # Autoresearch -Run the `/autoresearch` workflow. Read the prompt template at `prompts/autoresearch.md` for the full procedure. +Run the `/autoresearch` workflow. Read the prompt template at `../prompts/autoresearch.md` for the full procedure. Tools used: `init_experiment`, `run_experiment`, `log_experiment` (from pi-autoresearch) diff --git a/skills/contributing/SKILL.md b/skills/contributing/SKILL.md index f1806c7..33378d7 100644 --- a/skills/contributing/SKILL.md +++ b/skills/contributing/SKILL.md @@ -5,7 +5,7 @@ description: Contribute changes to the Feynman repository itself. 
Use when the t # Contributing -Read `CONTRIBUTING.md` first, then `AGENTS.md` for repo-level agent conventions. +Read `../CONTRIBUTING.md` first, then `../AGENTS.md` for repo-level agent conventions. Use this skill when working on Feynman itself, especially for: diff --git a/skills/deep-research/SKILL.md b/skills/deep-research/SKILL.md index 291990e..b37859d 100644 --- a/skills/deep-research/SKILL.md +++ b/skills/deep-research/SKILL.md @@ -5,7 +5,7 @@ description: Run a thorough, source-heavy investigation on any topic. Use when t # Deep Research -Run the `/deepresearch` workflow. Read the prompt template at `prompts/deepresearch.md` for the full procedure. +Run the `/deepresearch` workflow. Read the prompt template at `../prompts/deepresearch.md` for the full procedure. Agents used: `researcher`, `verifier`, `reviewer` diff --git a/skills/jobs/SKILL.md b/skills/jobs/SKILL.md index 3bdef42..db6be23 100644 --- a/skills/jobs/SKILL.md +++ b/skills/jobs/SKILL.md @@ -5,6 +5,6 @@ description: Inspect active background research work including running processes # Jobs -Run the `/jobs` workflow. Read the prompt template at `prompts/jobs.md` for the full procedure. +Run the `/jobs` workflow. Read the prompt template at `../prompts/jobs.md` for the full procedure. Shows active `pi-processes`, scheduled `pi-schedule-prompt` entries, and running subagent tasks. diff --git a/skills/literature-review/SKILL.md b/skills/literature-review/SKILL.md index b9c995b..04ed6af 100644 --- a/skills/literature-review/SKILL.md +++ b/skills/literature-review/SKILL.md @@ -5,7 +5,7 @@ description: Run a literature review using paper search and primary-source synth # Literature Review -Run the `/lit` workflow. Read the prompt template at `prompts/lit.md` for the full procedure. +Run the `/lit` workflow. Read the prompt template at `../prompts/lit.md` for the full procedure. 
Agents used: `researcher`, `verifier`, `reviewer` diff --git a/skills/paper-code-audit/SKILL.md b/skills/paper-code-audit/SKILL.md index ce7ace6..cae2786 100644 --- a/skills/paper-code-audit/SKILL.md +++ b/skills/paper-code-audit/SKILL.md @@ -5,7 +5,7 @@ description: Compare a paper's claims against its public codebase. Use when the # Paper-Code Audit -Run the `/audit` workflow. Read the prompt template at `prompts/audit.md` for the full procedure. +Run the `/audit` workflow. Read the prompt template at `../prompts/audit.md` for the full procedure. Agents used: `researcher`, `verifier` diff --git a/skills/paper-writing/SKILL.md b/skills/paper-writing/SKILL.md index d453a4e..be37a5c 100644 --- a/skills/paper-writing/SKILL.md +++ b/skills/paper-writing/SKILL.md @@ -5,7 +5,7 @@ description: Turn research findings into a polished paper-style draft with secti # Paper Writing -Run the `/draft` workflow. Read the prompt template at `prompts/draft.md` for the full procedure. +Run the `/draft` workflow. Read the prompt template at `../prompts/draft.md` for the full procedure. Agents used: `writer`, `verifier` diff --git a/skills/peer-review/SKILL.md b/skills/peer-review/SKILL.md index c4536de..bcb7a34 100644 --- a/skills/peer-review/SKILL.md +++ b/skills/peer-review/SKILL.md @@ -5,7 +5,7 @@ description: Simulate a tough but constructive peer review of an AI research art # Peer Review -Run the `/review` workflow. Read the prompt template at `prompts/review.md` for the full procedure. +Run the `/review` workflow. Read the prompt template at `../prompts/review.md` for the full procedure. Agents used: `researcher`, `reviewer` diff --git a/skills/replication/SKILL.md b/skills/replication/SKILL.md index 27925c5..66356b4 100644 --- a/skills/replication/SKILL.md +++ b/skills/replication/SKILL.md @@ -5,7 +5,7 @@ description: Plan or execute a replication of a paper, claim, or benchmark. Use # Replication -Run the `/replicate` workflow. 
Read the prompt template at `prompts/replicate.md` for the full procedure. +Run the `/replicate` workflow. Read the prompt template at `../prompts/replicate.md` for the full procedure. Agents used: `researcher` diff --git a/skills/session-log/SKILL.md b/skills/session-log/SKILL.md index 2b4b160..29970f3 100644 --- a/skills/session-log/SKILL.md +++ b/skills/session-log/SKILL.md @@ -5,6 +5,6 @@ description: Write a durable session log capturing completed work, findings, ope # Session Log -Run the `/log` workflow. Read the prompt template at `prompts/log.md` for the full procedure. +Run the `/log` workflow. Read the prompt template at `../prompts/log.md` for the full procedure. Output: session log in `notes/session-logs/`. diff --git a/skills/source-comparison/SKILL.md b/skills/source-comparison/SKILL.md index 5d43a69..ba7848e 100644 --- a/skills/source-comparison/SKILL.md +++ b/skills/source-comparison/SKILL.md @@ -5,7 +5,7 @@ description: Compare multiple sources on a topic and produce a grounded comparis # Source Comparison -Run the `/compare` workflow. Read the prompt template at `prompts/compare.md` for the full procedure. +Run the `/compare` workflow. Read the prompt template at `../prompts/compare.md` for the full procedure. Agents used: `researcher`, `verifier` diff --git a/skills/watch/SKILL.md b/skills/watch/SKILL.md index 573418d..f153acb 100644 --- a/skills/watch/SKILL.md +++ b/skills/watch/SKILL.md @@ -5,7 +5,7 @@ description: Set up a recurring research watch on a topic, company, paper area, # Watch -Run the `/watch` workflow. Read the prompt template at `prompts/watch.md` for the full procedure. +Run the `/watch` workflow. Read the prompt template at `../prompts/watch.md` for the full procedure. 
Agents used: `researcher` diff --git a/src/pi/launch.ts b/src/pi/launch.ts index f5286a8..097ac2d 100644 --- a/src/pi/launch.ts +++ b/src/pi/launch.ts @@ -1,7 +1,7 @@ import { spawn } from "node:child_process"; import { existsSync } from "node:fs"; -import { buildPiArgs, buildPiEnv, type PiRuntimeOptions, resolvePiPaths } from "./runtime.js"; +import { buildPiArgs, buildPiEnv, type PiRuntimeOptions, resolvePiPaths, toNodeImportSpecifier } from "./runtime.js"; import { ensureSupportedNodeVersion } from "../system/node-version.js"; export async function launchPiChat(options: PiRuntimeOptions): Promise { @@ -23,8 +23,8 @@ export async function launchPiChat(options: PiRuntimeOptions): Promise { } const importArgs = useDevPolyfill - ? ["--import", tsxLoaderPath, "--import", promisePolyfillSourcePath] - : ["--import", promisePolyfillPath]; + ? ["--import", toNodeImportSpecifier(tsxLoaderPath), "--import", toNodeImportSpecifier(promisePolyfillSourcePath)] + : ["--import", toNodeImportSpecifier(promisePolyfillPath)]; const child = spawn(process.execPath, [...importArgs, piCliPath, ...buildPiArgs(options)], { cwd: options.workingDir, diff --git a/src/pi/runtime.ts b/src/pi/runtime.ts index 9ab3e54..c0d0ce4 100644 --- a/src/pi/runtime.ts +++ b/src/pi/runtime.ts @@ -1,5 +1,6 @@ import { existsSync, readFileSync } from "node:fs"; -import { delimiter, dirname, resolve } from "node:path"; +import { delimiter, dirname, isAbsolute, resolve } from "node:path"; +import { pathToFileURL } from "node:url"; import { BROWSER_FALLBACK_PATHS, @@ -47,6 +48,10 @@ export function resolvePiPaths(appRoot: string) { }; } +export function toNodeImportSpecifier(modulePath: string): string { + return isAbsolute(modulePath) ? 
pathToFileURL(modulePath).href : modulePath; +} + export function validatePiInstallation(appRoot: string): string[] { const paths = resolvePiPaths(appRoot); const missing: string[] = []; @@ -97,6 +102,7 @@ export function buildPiEnv(options: PiRuntimeOptions): NodeJS.ProcessEnv { const paths = resolvePiPaths(options.appRoot); const feynmanNpmPrefixPath = getFeynmanNpmPrefixPath(options.feynmanAgentDir); const feynmanNpmBinPath = resolve(feynmanNpmPrefixPath, "bin"); + const feynmanWebSearchConfigPath = resolve(dirname(options.feynmanAgentDir), "web-search.json"); const currentPath = process.env.PATH ?? ""; const binEntries = [paths.nodeModulesBinPath, resolve(paths.piWorkspaceNodeModulesPath, ".bin"), feynmanNpmBinPath]; @@ -108,6 +114,7 @@ export function buildPiEnv(options: PiRuntimeOptions): NodeJS.ProcessEnv { FEYNMAN_VERSION: options.feynmanVersion, FEYNMAN_SESSION_DIR: options.sessionDir, FEYNMAN_MEMORY_DIR: resolve(dirname(options.feynmanAgentDir), "memory"), + FEYNMAN_WEB_SEARCH_CONFIG: feynmanWebSearchConfigPath, FEYNMAN_NODE_EXECUTABLE: process.execPath, FEYNMAN_BIN_PATH: resolve(options.appRoot, "bin", "feynman.js"), FEYNMAN_NPM_PREFIX: feynmanNpmPrefixPath, diff --git a/src/pi/web-access.ts b/src/pi/web-access.ts index f439ae1..e4f1071 100644 --- a/src/pi/web-access.ts +++ b/src/pi/web-access.ts @@ -5,6 +5,7 @@ import { resolve } from "node:path"; export type PiWebSearchProvider = "auto" | "perplexity" | "exa" | "gemini"; export type PiWebAccessConfig = Record & { + route?: PiWebSearchProvider; provider?: PiWebSearchProvider; searchProvider?: PiWebSearchProvider; perplexityApiKey?: string; @@ -80,8 +81,9 @@ export function getPiWebAccessStatus( config: PiWebAccessConfig = loadPiWebAccessConfig(), configPath = getPiWebSearchConfigPath(), ): PiWebAccessStatus { - const searchProvider = normalizeProvider(config.searchProvider) ?? "auto"; - const requestProvider = normalizeProvider(config.provider) ?? 
searchProvider; + const searchProvider = + normalizeProvider(config.searchProvider) ?? normalizeProvider(config.route) ?? normalizeProvider(config.provider) ?? "auto"; + const requestProvider = normalizeProvider(config.provider) ?? normalizeProvider(config.route) ?? searchProvider; const perplexityConfigured = Boolean(normalizeNonEmptyString(config.perplexityApiKey)); const exaConfigured = Boolean(normalizeNonEmptyString(config.exaApiKey)); const geminiApiConfigured = Boolean(normalizeNonEmptyString(config.geminiApiKey)); diff --git a/tests/catalog-snapshot.test.ts b/tests/catalog-snapshot.test.ts new file mode 100644 index 0000000..96a321b --- /dev/null +++ b/tests/catalog-snapshot.test.ts @@ -0,0 +1,110 @@ +import test from "node:test"; +import assert from "node:assert/strict"; + +import { buildModelStatusSnapshotFromRecords } from "../src/model/catalog.js"; + +test("buildModelStatusSnapshotFromRecords returns empty guidance when model is set and valid", () => { + const snapshot = buildModelStatusSnapshotFromRecords( + [{ provider: "anthropic", id: "claude-opus-4-6" }], + [{ provider: "anthropic", id: "claude-opus-4-6" }], + "anthropic/claude-opus-4-6", + ); + + assert.equal(snapshot.currentValid, true); + assert.equal(snapshot.current, "anthropic/claude-opus-4-6"); + assert.equal(snapshot.guidance.length, 0); +}); + +test("buildModelStatusSnapshotFromRecords emits guidance when no models are available", () => { + const snapshot = buildModelStatusSnapshotFromRecords([], [], undefined); + + assert.equal(snapshot.currentValid, false); + assert.equal(snapshot.current, undefined); + assert.equal(snapshot.recommended, undefined); + assert.ok(snapshot.guidance.some((line) => line.includes("No authenticated Pi models"))); +}); + +test("buildModelStatusSnapshotFromRecords emits guidance when no default model is set", () => { + const snapshot = buildModelStatusSnapshotFromRecords( + [{ provider: "openai", id: "gpt-5.4" }], + [{ provider: "openai", id: "gpt-5.4" }], + 
undefined, + ); + + assert.equal(snapshot.currentValid, false); + assert.equal(snapshot.current, undefined); + assert.ok(snapshot.guidance.some((line) => line.includes("No default research model"))); +}); + +test("buildModelStatusSnapshotFromRecords marks provider as configured only when it has available models", () => { + const snapshot = buildModelStatusSnapshotFromRecords( + [ + { provider: "anthropic", id: "claude-opus-4-6" }, + { provider: "openai", id: "gpt-5.4" }, + ], + [{ provider: "openai", id: "gpt-5.4" }], + "openai/gpt-5.4", + ); + + const anthropicProvider = snapshot.providers.find((provider) => provider.id === "anthropic"); + const openaiProvider = snapshot.providers.find((provider) => provider.id === "openai"); + + assert.ok(anthropicProvider); + assert.equal(anthropicProvider!.configured, false); + assert.equal(anthropicProvider!.supportedModels, 1); + assert.equal(anthropicProvider!.availableModels, 0); + + assert.ok(openaiProvider); + assert.equal(openaiProvider!.configured, true); + assert.equal(openaiProvider!.supportedModels, 1); + assert.equal(openaiProvider!.availableModels, 1); +}); + +test("buildModelStatusSnapshotFromRecords marks provider as current when selected model belongs to it", () => { + const snapshot = buildModelStatusSnapshotFromRecords( + [ + { provider: "anthropic", id: "claude-opus-4-6" }, + { provider: "openai", id: "gpt-5.4" }, + ], + [ + { provider: "anthropic", id: "claude-opus-4-6" }, + { provider: "openai", id: "gpt-5.4" }, + ], + "anthropic/claude-opus-4-6", + ); + + const anthropicProvider = snapshot.providers.find((provider) => provider.id === "anthropic"); + const openaiProvider = snapshot.providers.find((provider) => provider.id === "openai"); + + assert.equal(anthropicProvider!.current, true); + assert.equal(openaiProvider!.current, false); +}); + +test("buildModelStatusSnapshotFromRecords returns available models sorted by research preference", () => { + const snapshot = buildModelStatusSnapshotFromRecords( + [ 
+      { provider: "openai", id: "gpt-5.4" },
+      { provider: "anthropic", id: "claude-opus-4-6" },
+    ],
+    [
+      { provider: "openai", id: "gpt-5.4" },
+      { provider: "anthropic", id: "claude-opus-4-6" },
+    ],
+    undefined,
+  );
+
+  assert.equal(snapshot.availableModels[0], "anthropic/claude-opus-4-6");
+  assert.equal(snapshot.availableModels[1], "openai/gpt-5.4");
+  assert.equal(snapshot.recommended, "anthropic/claude-opus-4-6");
+});
+
+test("buildModelStatusSnapshotFromRecords sets currentValid false when current model is not in available list", () => {
+  const snapshot = buildModelStatusSnapshotFromRecords(
+    [{ provider: "anthropic", id: "claude-opus-4-6" }],
+    [],
+    "anthropic/claude-opus-4-6",
+  );
+
+  assert.equal(snapshot.currentValid, false);
+  assert.equal(snapshot.current, "anthropic/claude-opus-4-6");
+});
diff --git a/tests/config-paths.test.ts b/tests/config-paths.test.ts
new file mode 100644
index 0000000..f016350
--- /dev/null
+++ b/tests/config-paths.test.ts
@@ -0,0 +1,92 @@
+import test from "node:test";
+import assert from "node:assert/strict";
+import { existsSync, mkdtempSync, rmSync } from "node:fs";
+import { tmpdir } from "node:os";
+import { join, resolve } from "node:path";
+
+import {
+  ensureFeynmanHome,
+  getBootstrapStatePath,
+  getDefaultSessionDir,
+  getFeynmanAgentDir,
+  getFeynmanHome,
+  getFeynmanMemoryDir,
+  getFeynmanStateDir,
+} from "../src/config/paths.js";
+
+test("getFeynmanHome uses FEYNMAN_HOME env var when set", () => {
+  const previous = process.env.FEYNMAN_HOME;
+  try {
+    process.env.FEYNMAN_HOME = "/custom/home";
+    assert.equal(getFeynmanHome(), resolve("/custom/home", ".feynman"));
+  } finally {
+    if (previous === undefined) {
+      delete process.env.FEYNMAN_HOME;
+    } else {
+      process.env.FEYNMAN_HOME = previous;
+    }
+  }
+});
+
+test("getFeynmanHome falls back to homedir when FEYNMAN_HOME is unset", () => {
+  const previous = process.env.FEYNMAN_HOME;
+  try {
+    delete process.env.FEYNMAN_HOME;
+    const home = getFeynmanHome();
+    assert.ok(home.endsWith(".feynman"), `expected path ending in .feynman, got: ${home}`);
+    assert.ok(!home.includes("undefined"), `expected no 'undefined' in path, got: ${home}`);
+  } finally {
+    if (previous === undefined) {
+      delete process.env.FEYNMAN_HOME;
+    } else {
+      process.env.FEYNMAN_HOME = previous;
+    }
+  }
+});
+
+test("getFeynmanAgentDir resolves to /agent", () => {
+  assert.equal(getFeynmanAgentDir("/some/home"), resolve("/some/home", "agent"));
+});
+
+test("getFeynmanMemoryDir resolves to /memory", () => {
+  assert.equal(getFeynmanMemoryDir("/some/home"), resolve("/some/home", "memory"));
+});
+
+test("getFeynmanStateDir resolves to /.state", () => {
+  assert.equal(getFeynmanStateDir("/some/home"), resolve("/some/home", ".state"));
+});
+
+test("getDefaultSessionDir resolves to /sessions", () => {
+  assert.equal(getDefaultSessionDir("/some/home"), resolve("/some/home", "sessions"));
+});
+
+test("getBootstrapStatePath resolves to /.state/bootstrap.json", () => {
+  assert.equal(getBootstrapStatePath("/some/home"), resolve("/some/home", ".state", "bootstrap.json"));
+});
+
+test("ensureFeynmanHome creates all required subdirectories", () => {
+  const root = mkdtempSync(join(tmpdir(), "feynman-paths-"));
+  try {
+    const home = join(root, "home");
+    ensureFeynmanHome(home);
+
+    assert.ok(existsSync(home), "home dir should exist");
+    assert.ok(existsSync(join(home, "agent")), "agent dir should exist");
+    assert.ok(existsSync(join(home, "memory")), "memory dir should exist");
+    assert.ok(existsSync(join(home, ".state")), ".state dir should exist");
+    assert.ok(existsSync(join(home, "sessions")), "sessions dir should exist");
+  } finally {
+    rmSync(root, { recursive: true, force: true });
+  }
+});
+
+test("ensureFeynmanHome is idempotent when dirs already exist", () => {
+  const root = mkdtempSync(join(tmpdir(), "feynman-paths-"));
+  try {
+    const home = join(root, "home");
+    ensureFeynmanHome(home);
+    assert.doesNotThrow(() => ensureFeynmanHome(home));
+  } finally {
+    rmSync(root, { recursive: true, force: true });
+  }
+});
diff --git a/tests/content-policy.test.ts b/tests/content-policy.test.ts
new file mode 100644
index 0000000..806f591
--- /dev/null
+++ b/tests/content-policy.test.ts
@@ -0,0 +1,32 @@
+import test from "node:test";
+import assert from "node:assert/strict";
+import { readdirSync, readFileSync } from "node:fs";
+import { dirname, join, resolve } from "node:path";
+import { fileURLToPath } from "node:url";
+
+const repoRoot = resolve(dirname(fileURLToPath(import.meta.url)), "..");
+const bannedPatterns = [/ValiChord/i, /Harmony Record/i, /harmony_record_/i];
+
+function collectMarkdownFiles(root: string): string[] {
+  const files: string[] = [];
+  for (const entry of readdirSync(root, { withFileTypes: true })) {
+    const fullPath = join(root, entry.name);
+    if (entry.isDirectory()) {
+      files.push(...collectMarkdownFiles(fullPath));
+      continue;
+    }
+    if (entry.isFile() && fullPath.endsWith(".md")) {
+      files.push(fullPath);
+    }
+  }
+  return files;
+}
+
+test("bundled prompts and skills do not contain blocked promotional product content", () => {
+  for (const filePath of [...collectMarkdownFiles(join(repoRoot, "prompts")), ...collectMarkdownFiles(join(repoRoot, "skills"))]) {
+    const content = readFileSync(filePath, "utf8");
+    for (const pattern of bannedPatterns) {
+      assert.doesNotMatch(content, pattern, `${filePath} contains blocked promotional pattern ${pattern}`);
+    }
+  }
+});
diff --git a/tests/pi-runtime.test.ts b/tests/pi-runtime.test.ts
index 7190677..3425c1f 100644
--- a/tests/pi-runtime.test.ts
+++ b/tests/pi-runtime.test.ts
@@ -1,7 +1,8 @@
 import test from "node:test";
 import assert from "node:assert/strict";
+import { pathToFileURL } from "node:url";
 
-import { applyFeynmanPackageManagerEnv, buildPiArgs, buildPiEnv, resolvePiPaths } from "../src/pi/runtime.js";
+import { applyFeynmanPackageManagerEnv, buildPiArgs, buildPiEnv, resolvePiPaths, toNodeImportSpecifier } from "../src/pi/runtime.js";
 
 test("buildPiArgs includes configured runtime paths and prompt", () => {
   const args = buildPiArgs({
@@ -106,3 +107,11 @@ test("resolvePiPaths includes the Promise.withResolvers polyfill path", () => {
 
   assert.equal(paths.promisePolyfillPath, "/repo/feynman/dist/system/promise-polyfill.js");
 });
+
+test("toNodeImportSpecifier converts absolute preload paths to file URLs", () => {
+  assert.equal(
+    toNodeImportSpecifier("/repo/feynman/dist/system/promise-polyfill.js"),
+    pathToFileURL("/repo/feynman/dist/system/promise-polyfill.js").href,
+  );
+  assert.equal(toNodeImportSpecifier("tsx"), "tsx");
+});
diff --git a/tests/pi-web-access-patch.test.ts b/tests/pi-web-access-patch.test.ts
new file mode 100644
index 0000000..321e472
--- /dev/null
+++ b/tests/pi-web-access-patch.test.ts
@@ -0,0 +1,48 @@
+import test from "node:test";
+import assert from "node:assert/strict";
+
+import { patchPiWebAccessSource } from "../scripts/lib/pi-web-access-patch.mjs";
+
+test("patchPiWebAccessSource rewrites legacy Pi web-search config paths", () => {
+  const input = [
+    'import { join } from "node:path";',
+    'import { homedir } from "node:os";',
+    'const CONFIG_PATH = join(homedir(), ".pi", "web-search.json");',
+    "",
+  ].join("\n");
+
+  const patched = patchPiWebAccessSource("perplexity.ts", input);
+
+  assert.match(patched, /FEYNMAN_WEB_SEARCH_CONFIG/);
+  assert.match(patched, /PI_WEB_SEARCH_CONFIG/);
+});
+
+test("patchPiWebAccessSource updates index.ts directory handling", () => {
+  const input = [
+    'import { existsSync, mkdirSync } from "node:fs";',
+    'import { join } from "node:path";',
+    'import { homedir } from "node:os";',
+    'const WEB_SEARCH_CONFIG_PATH = join(homedir(), ".pi", "web-search.json");',
+    'const dir = join(homedir(), ".pi");',
+    "",
+  ].join("\n");
+
+  const patched = patchPiWebAccessSource("index.ts", input);
+
+  assert.match(patched, /import \{ dirname, join \} from "node:path";/);
+  assert.match(patched, /const dir = dirname\(WEB_SEARCH_CONFIG_PATH\);/);
+});
+
+test("patchPiWebAccessSource is idempotent", () => {
+  const input = [
+    'import { join } from "node:path";',
+    'import { homedir } from "node:os";',
+    'const CONFIG_PATH = join(homedir(), ".pi", "web-search.json");',
+    "",
+  ].join("\n");
+
+  const once = patchPiWebAccessSource("perplexity.ts", input);
+  const twice = patchPiWebAccessSource("perplexity.ts", once);
+
+  assert.equal(twice, once);
+});
diff --git a/tests/pi-web-access.test.ts b/tests/pi-web-access.test.ts
index ca6bcfb..99fd123 100644
--- a/tests/pi-web-access.test.ts
+++ b/tests/pi-web-access.test.ts
@@ -67,6 +67,17 @@ test("getPiWebAccessStatus reads Gemini routes directly", () => {
   assert.equal(status.chromeProfile, "Profile 2");
 });
 
+test("getPiWebAccessStatus supports the legacy route key", () => {
+  const status = getPiWebAccessStatus({
+    route: "perplexity",
+    perplexityApiKey: "pplx_...",
+  });
+
+  assert.equal(status.routeLabel, "Perplexity");
+  assert.equal(status.requestProvider, "perplexity");
+  assert.equal(status.perplexityConfigured, true);
+});
+
 test("formatPiWebAccessDoctorLines reports Pi-managed web access", () => {
   const lines = formatPiWebAccessDoctorLines(
     getPiWebAccessStatus({
diff --git a/tests/skill-paths.test.ts b/tests/skill-paths.test.ts
new file mode 100644
index 0000000..3c75753
--- /dev/null
+++ b/tests/skill-paths.test.ts
@@ -0,0 +1,28 @@
+import test from "node:test";
+import assert from "node:assert/strict";
+import { existsSync, readdirSync, readFileSync } from "node:fs";
+import { dirname, join, resolve } from "node:path";
+import { fileURLToPath } from "node:url";
+
+const repoRoot = resolve(dirname(fileURLToPath(import.meta.url)), "..");
+const skillsRoot = join(repoRoot, "skills");
+const markdownPathPattern = /`((?:\.\.?\/)(?:[A-Za-z0-9._-]+\/)*[A-Za-z0-9._-]+\.md)`/g;
+const simulatedInstallRoot = join(repoRoot, "__skill-install-root__");
+
+test("all local markdown references in bundled skills resolve in the installed skill layout", () => {
+  for (const entry of readdirSync(skillsRoot, { withFileTypes: true })) {
+    if (!entry.isDirectory()) continue;
+
+    const skillPath = join(skillsRoot, entry.name, "SKILL.md");
+    if (!existsSync(skillPath)) continue;
+
+    const content = readFileSync(skillPath, "utf8");
+    for (const match of content.matchAll(markdownPathPattern)) {
+      const reference = match[1];
+      const installedSkillDir = join(simulatedInstallRoot, entry.name);
+      const installedTarget = resolve(installedSkillDir, reference);
+      const repoTarget = installedTarget.replace(simulatedInstallRoot, repoRoot);
+      assert.ok(existsSync(repoTarget), `${skillPath} references missing installed markdown file ${reference}`);
+    }
+  }
+});
diff --git a/website/public/install-skills b/website/public/install-skills
index 0e1df44..f832ae9 100644
--- a/website/public/install-skills
+++ b/website/public/install-skills
@@ -181,8 +181,8 @@ step "Extracting skills"
 tar -xzf "$archive_path" -C "$extract_dir"
 
 source_root="$(find "$extract_dir" -mindepth 1 -maxdepth 1 -type d | head -n 1)"
-if [ -z "$source_root" ] || [ ! -d "$source_root/skills" ]; then
-  echo "Could not find skills/ in downloaded archive." >&2
+if [ -z "$source_root" ] || [ ! -d "$source_root/skills" ] || [ ! -d "$source_root/prompts" ]; then
+  echo "Could not find the bundled skills resources in the downloaded archive." >&2
   exit 1
 fi
 
@@ -190,6 +190,10 @@ mkdir -p "$(dirname "$install_dir")"
 rm -rf "$install_dir"
 mkdir -p "$install_dir"
 cp -R "$source_root/skills/." "$install_dir/"
+mkdir -p "$install_dir/prompts"
+cp -R "$source_root/prompts/." "$install_dir/prompts/"
+cp "$source_root/AGENTS.md" "$install_dir/AGENTS.md"
+cp "$source_root/CONTRIBUTING.md" "$install_dir/CONTRIBUTING.md"
 
 step "Installed skills to $install_dir"
 
 case "$SCOPE" in
diff --git a/website/public/install-skills.ps1 b/website/public/install-skills.ps1
index c650efa..b081451 100644
--- a/website/public/install-skills.ps1
+++ b/website/public/install-skills.ps1
@@ -92,8 +92,9 @@ try {
   }
 
   $skillsSource = Join-Path $sourceRoot.FullName "skills"
-  if (-not (Test-Path $skillsSource)) {
-    throw "Could not find skills/ in downloaded archive."
+  $promptsSource = Join-Path $sourceRoot.FullName "prompts"
+  if (-not (Test-Path $skillsSource) -or -not (Test-Path $promptsSource)) {
+    throw "Could not find the bundled skills resources in the downloaded archive."
   }
 
   $installParent = Split-Path $installDir -Parent
@@ -107,6 +108,10 @@ try {
   New-Item -ItemType Directory -Path $installDir -Force | Out-Null
   Copy-Item -Path (Join-Path $skillsSource "*") -Destination $installDir -Recurse -Force
+  New-Item -ItemType Directory -Path (Join-Path $installDir "prompts") -Force | Out-Null
+  Copy-Item -Path (Join-Path $promptsSource "*") -Destination (Join-Path $installDir "prompts") -Recurse -Force
+  Copy-Item -Path (Join-Path $sourceRoot.FullName "AGENTS.md") -Destination (Join-Path $installDir "AGENTS.md") -Force
+  Copy-Item -Path (Join-Path $sourceRoot.FullName "CONTRIBUTING.md") -Destination (Join-Path $installDir "CONTRIBUTING.md") -Force
 
   Write-Host "==> Installed skills to $installDir"
 
   if ($Scope -eq "Repo") {
diff --git a/website/public/install.ps1 b/website/public/install.ps1
index 865c0c7..06564b5 100644
--- a/website/public/install.ps1
+++ b/website/public/install.ps1
@@ -125,12 +125,18 @@ Workarounds:
 
   New-Item -ItemType Directory -Path $installBinDir -Force | Out-Null
   $shimPath = Join-Path $installBinDir "feynman.cmd"
+  $shimPs1Path = Join-Path $installBinDir "feynman.ps1"
   Write-Host "==> Linking feynman into $installBinDir"
 
   @"
@echo off
-"$bundleDir\feynman.cmd" %*
+CALL "$bundleDir\feynman.cmd" %*
"@ | Set-Content -Path $shimPath -Encoding ASCII
+
+  @"
+`$BundleDir = "$bundleDir"
+& "`$BundleDir\node\node.exe" "`$BundleDir\app\bin\feynman.js" @args
+"@ | Set-Content -Path $shimPs1Path -Encoding UTF8
+
   $currentUserPath = [Environment]::GetEnvironmentVariable("Path", "User")
   $alreadyOnPath = $false
   if ($currentUserPath) {
diff --git a/website/src/content/docs/getting-started/installation.md b/website/src/content/docs/getting-started/installation.md
index d6a063e..94183e4 100644
--- a/website/src/content/docs/getting-started/installation.md
+++ b/website/src/content/docs/getting-started/installation.md
@@ -55,20 +55,20 @@ Or install them repo-locally:
 & ([scriptblock]::Create((irm https://feynman.is/install-skills.ps1))) -Scope Repo
 ```
 
-These installers download only the `skills/` tree from the Feynman repository. They do not install the Feynman terminal, bundled Node runtime, auth storage, or Pi packages.
+These installers download the bundled `skills/` and `prompts/` trees plus the repo guidance files referenced by those skills. They do not install the Feynman terminal, bundled Node runtime, auth storage, or Pi packages.
 
 ## Pinned releases
 
 The one-line installer already targets the latest tagged release. To pin an exact version, pass it explicitly:
 
 ```bash
-curl -fsSL https://feynman.is/install | bash -s -- 0.2.16
+curl -fsSL https://feynman.is/install | bash -s -- 0.2.17
 ```
 
 On Windows:
 
 ```powershell
-& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.16
+& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.17
 ```
 
 ## pnpm