fix startup packaging and content guardrails

This commit is contained in:
Advait Paliwal
2026-04-09 10:09:05 -07:00
parent 554350cc0e
commit 3148f2e62b
39 changed files with 518 additions and 43 deletions

View File

@@ -113,3 +113,21 @@ Use this file to track chronology, not release notes. Keep entries short, factua
- Failed / learned: The remaining ValiChord PR is still stale and mixes a real prompt/skill update with unrelated branch churn; it is a review/triage item, not a clean merge candidate.
- Blockers: No local build blockers remain; issue/PR closure still depends on the final push landing on `main`.
- Next: Push the verified cleanup commit, then close issues fixed by the dependency bump plus the new discoverability/service-tier/Windows patches, and close the stale ValiChord PR explicitly instead of leaving it open indefinitely.
### 2026-04-09 09:37 PDT — windows-startup-import-specifiers
- Objective: Fix Windows startup failures where `feynman` exits before the Pi child process initializes.
- Changed: Converted the Node preload module paths passed via `node --import` in `src/pi/launch.ts` to `file://` specifiers using a new `toNodeImportSpecifier(...)` helper in `src/pi/runtime.ts`; expanded `scripts/patch-embedded-pi.mjs` so it also patches the bundled workspace copy of Pi's extension loader when present.
- Verified: Added a regression test in `tests/pi-runtime.test.ts` covering absolute-path to `file://` conversion for preload imports; ran `npm test`, `npm run typecheck`, and `npm run build`.
- Failed / learned: The raw Windows `ERR_UNSUPPORTED_ESM_URL_SCHEME` stack is more consistent with Node rejecting the child-process `--import C:\\...` preload before Pi starts than with a normal in-app extension load failure.
- Blockers: Windows runtime execution was not available locally, so the fix is verified by code path inspection and automated tests rather than an actual Windows shell run.
- Next: Ask the affected user to reinstall or update to the next published package once released, and confirm the Windows REPL now starts from a normal PowerShell session.
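The conversion this entry describes can be sketched as a small helper. This mirrors the `toNodeImportSpecifier` helper added in `src/pi/runtime.ts` (a simplified sketch, not the shipped module):

```typescript
import { isAbsolute, resolve } from "node:path";
import { pathToFileURL } from "node:url";

// On Windows, `node --import C:\...` is parsed as a URL whose scheme is "c:",
// which Node rejects with ERR_UNSUPPORTED_ESM_URL_SCHEME. Converting absolute
// paths to file:// URLs sidesteps that; bare specifiers pass through untouched.
function toNodeImportSpecifier(modulePath: string): string {
  return isAbsolute(modulePath) ? pathToFileURL(modulePath).href : modulePath;
}

// Example: an absolute preload path becomes a file:// URL.
const preload = toNodeImportSpecifier(resolve("preload.mjs"));
```

Bare specifiers such as `tsx` are left alone, so the same call site works for dev-time loaders and bundled preload files.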
### 2026-04-09 11:02 PDT — tracker-hardening-pass
- Objective: Triage the open repo backlog, land the highest-signal fixes locally, and add guardrails against stale promotional workflow content.
- Changed: Hardened Windows launch paths in `bin/feynman.js`, `scripts/build-native-bundle.mjs`, and `scripts/install/install.ps1`; set npm prefix overrides earlier in `scripts/patch-embedded-pi.mjs`; added a `pi-web-access` runtime patch helper plus `FEYNMAN_WEB_SEARCH_CONFIG` env wiring so bundled web search reads the same `~/.feynman/web-search.json` that doctor/status report; taught `src/pi/web-access.ts` to honor the legacy `route` key; fixed bundled skill references and expanded the skills-only installers/docs to ship the prompt and guidance files those skills reference; added regression tests for config paths, catalog snapshot edges, skill-path packaging, `pi-web-access` patching, and blocked promotional content.
- Verified: Ran `npm test`, `npm run typecheck`, and `npm run build` successfully after the full maintenance pass.
- Failed / learned: The skills-only install issue was not just docs drift; the shipped `SKILL.md` files referenced prompt paths that only made sense after installation, so the repo needed both path normalization and packaging changes.
- Blockers: Remote issue/PR closure and merge actions still depend on the final reviewed branch state being pushed.
- Next: Push the validated fixes, close the duplicate Windows/reporting issues they supersede, reject the promotional ValiChord PR explicitly, and then review whether the remaining docs-only or feature PRs should be merged separately.
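The `FEYNMAN_WEB_SEARCH_CONFIG` wiring described above amounts to an env-first config path lookup. A minimal sketch (the function name is illustrative; the precedence matches the patched expression installed by `scripts/patch-embedded-pi.mjs`):

```typescript
import { homedir } from "node:os";
import { join } from "node:path";

// Env-first lookup: FEYNMAN_WEB_SEARCH_CONFIG wins, then the legacy
// PI_WEB_SEARCH_CONFIG, then the stock ~/.pi/web-search.json default.
function webSearchConfigPath(env: Record<string, string | undefined>): string {
  return (
    env.FEYNMAN_WEB_SEARCH_CONFIG ??
    env.PI_WEB_SEARCH_CONFIG ??
    join(homedir(), ".pi", "web-search.json")
  );
}
```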

View File

@@ -59,6 +59,7 @@ npm run build
- Avoid refactor-only PRs unless they are necessary to unblock a real fix or requested by a maintainer.
- Do not silently change release behavior, installer behavior, or runtime defaults without documenting the reason in the PR.
- Use American English in docs, comments, prompts, UI copy, and examples.
- Do not add bundled prompts, skills, or docs whose primary purpose is to market, endorse, or funnel users toward a third-party product or service. Product integrations must be justified by user-facing utility and written in neutral language.
## Repo-Specific Checks

View File

@@ -25,7 +25,7 @@ curl -fsSL https://feynman.is/install | bash
irm https://feynman.is/install.ps1 | iex
```
-The one-line installer fetches the latest tagged release. To pin a version, pass it explicitly, for example `curl -fsSL https://feynman.is/install | bash -s -- 0.2.16`.
+The one-line installer fetches the latest tagged release. To pin a version, pass it explicitly, for example `curl -fsSL https://feynman.is/install | bash -s -- 0.2.17`.
If you install via `pnpm` or `bun` instead of the standalone bundle, Feynman requires Node.js `20.19.0` or newer.
@@ -63,6 +63,8 @@ curl -fsSL https://feynman.is/install-skills | bash -s -- --repo
That installs into `.agents/skills/feynman` under the current repository.
These installers download the bundled `skills/` and `prompts/` trees plus the repo guidance files referenced by those skills. They do not install the Feynman terminal, bundled Node runtime, auth storage, or Pi packages.
---
### What you type → what happens

View File

@@ -1,4 +1,7 @@
#!/usr/bin/env node
import { resolve } from "node:path";
import { pathToFileURL } from "node:url";
const MIN_NODE_VERSION = "20.19.0";
function parseNodeVersion(version) {
@@ -27,5 +30,7 @@ if (compareNodeVersions(parseNodeVersion(process.versions.node), parseNodeVersio
: "curl -fsSL https://feynman.is/install | bash");
process.exit(1);
}
-await import(new URL("../scripts/patch-embedded-pi.mjs", import.meta.url).href);
-await import(new URL("../dist/index.js", import.meta.url).href);
+const here = import.meta.dirname;
+await import(pathToFileURL(resolve(here, "..", "scripts", "patch-embedded-pi.mjs")).href);
+await import(pathToFileURL(resolve(here, "..", "dist", "index.js")).href);

package-lock.json generated
View File

@@ -1,12 +1,12 @@
{
"name": "@companion-ai/feynman",
-"version": "0.2.16",
+"version": "0.2.17",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "@companion-ai/feynman",
-"version": "0.2.16",
+"version": "0.2.17",
"hasInstallScript": true,
"license": "MIT",
"dependencies": {

View File

@@ -1,6 +1,6 @@
{
"name": "@companion-ai/feynman",
-"version": "0.2.16",
+"version": "0.2.17",
"description": "Research-first CLI agent built on Pi and alphaXiv",
"license": "MIT",
"type": "module",

View File

@@ -275,7 +275,8 @@ function writeLauncher(bundleRoot, target) {
"@echo off",
"setlocal",
'set "ROOT=%~dp0"',
-'"%ROOT%node\\node.exe" "%ROOT%app\\bin\\feynman.js" %*',
+'if "%ROOT:~-1%"=="\\" set "ROOT=%ROOT:~0,-1%"',
+'"%ROOT%\\node\\node.exe" "%ROOT%\\app\\bin\\feynman.js" %*',
"",
].join("\r\n"),
"utf8",

View File

@@ -92,8 +92,9 @@ try {
}
$skillsSource = Join-Path $sourceRoot.FullName "skills"
-if (-not (Test-Path $skillsSource)) {
-throw "Could not find skills/ in downloaded archive."
+$promptsSource = Join-Path $sourceRoot.FullName "prompts"
+if (-not (Test-Path $skillsSource) -or -not (Test-Path $promptsSource)) {
+throw "Could not find the bundled skills resources in the downloaded archive."
}
$installParent = Split-Path $installDir -Parent
@@ -107,6 +108,10 @@ try {
New-Item -ItemType Directory -Path $installDir -Force | Out-Null
Copy-Item -Path (Join-Path $skillsSource "*") -Destination $installDir -Recurse -Force
New-Item -ItemType Directory -Path (Join-Path $installDir "prompts") -Force | Out-Null
Copy-Item -Path (Join-Path $promptsSource "*") -Destination (Join-Path $installDir "prompts") -Recurse -Force
Copy-Item -Path (Join-Path $sourceRoot.FullName "AGENTS.md") -Destination (Join-Path $installDir "AGENTS.md") -Force
Copy-Item -Path (Join-Path $sourceRoot.FullName "CONTRIBUTING.md") -Destination (Join-Path $installDir "CONTRIBUTING.md") -Force
Write-Host "==> Installed skills to $installDir"
if ($Scope -eq "Repo") {

View File

@@ -181,8 +181,8 @@ step "Extracting skills"
tar -xzf "$archive_path" -C "$extract_dir"
source_root="$(find "$extract_dir" -mindepth 1 -maxdepth 1 -type d | head -n 1)"
-if [ -z "$source_root" ] || [ ! -d "$source_root/skills" ]; then
-echo "Could not find skills/ in downloaded archive." >&2
+if [ -z "$source_root" ] || [ ! -d "$source_root/skills" ] || [ ! -d "$source_root/prompts" ]; then
+echo "Could not find the bundled skills resources in the downloaded archive." >&2
exit 1
fi
@@ -190,6 +190,10 @@ mkdir -p "$(dirname "$install_dir")"
rm -rf "$install_dir"
mkdir -p "$install_dir"
cp -R "$source_root/skills/." "$install_dir/"
mkdir -p "$install_dir/prompts"
cp -R "$source_root/prompts/." "$install_dir/prompts/"
cp "$source_root/AGENTS.md" "$install_dir/AGENTS.md"
cp "$source_root/CONTRIBUTING.md" "$install_dir/CONTRIBUTING.md"
step "Installed skills to $install_dir"
case "$SCOPE" in

View File

@@ -125,12 +125,18 @@ Workarounds:
New-Item -ItemType Directory -Path $installBinDir -Force | Out-Null
$shimPath = Join-Path $installBinDir "feynman.cmd"
$shimPs1Path = Join-Path $installBinDir "feynman.ps1"
Write-Host "==> Linking feynman into $installBinDir"
@"
@echo off
-"$bundleDir\feynman.cmd" %*
+CALL "$bundleDir\feynman.cmd" %*
"@ | Set-Content -Path $shimPath -Encoding ASCII
@"
`$BundleDir = "$bundleDir"
& "`$BundleDir\node\node.exe" "`$BundleDir\app\bin\feynman.js" @args
"@ | Set-Content -Path $shimPs1Path -Encoding UTF8
$currentUserPath = [Environment]::GetEnvironmentVariable("Path", "User")
$alreadyOnPath = $false
if ($currentUserPath) {

View File

@@ -0,0 +1,2 @@
export const PI_WEB_ACCESS_PATCH_TARGETS: string[];
export function patchPiWebAccessSource(relativePath: string, source: string): string;

View File

@@ -0,0 +1,32 @@
export const PI_WEB_ACCESS_PATCH_TARGETS = [
"index.ts",
"exa.ts",
"gemini-api.ts",
"gemini-search.ts",
"gemini-web.ts",
"github-extract.ts",
"perplexity.ts",
"video-extract.ts",
"youtube-extract.ts",
];
const LEGACY_CONFIG_EXPR = 'join(homedir(), ".pi", "web-search.json")';
const PATCHED_CONFIG_EXPR =
'process.env.FEYNMAN_WEB_SEARCH_CONFIG ?? process.env.PI_WEB_SEARCH_CONFIG ?? join(homedir(), ".pi", "web-search.json")';
export function patchPiWebAccessSource(relativePath, source) {
let patched = source;
if (patched.includes(PATCHED_CONFIG_EXPR)) {
return patched;
}
patched = patched.split(LEGACY_CONFIG_EXPR).join(PATCHED_CONFIG_EXPR);
if (relativePath === "index.ts" && patched !== source) {
patched = patched.replace('import { join } from "node:path";', 'import { dirname, join } from "node:path";');
patched = patched.replace('const dir = join(homedir(), ".pi");', "const dir = dirname(WEB_SEARCH_CONFIG_PATH);");
}
return patched;
}
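Worth noting about the helper above: the patched expression still contains the legacy expression as its final fallback, so the `includes` guard is what makes re-running the patch a no-op. A self-contained sketch of that idempotence property (constants copied from the file, function simplified):

```typescript
const LEGACY = 'join(homedir(), ".pi", "web-search.json")';
const PATCHED =
  "process.env.FEYNMAN_WEB_SEARCH_CONFIG ?? process.env.PI_WEB_SEARCH_CONFIG ?? " + LEGACY;

// Same shape as patchPiWebAccessSource: bail out if already patched, otherwise
// replace every occurrence via the split/join idiom (no regex escaping needed
// for a literal search string).
function patchOnce(source: string): string {
  if (source.includes(PATCHED)) return source;
  return source.split(LEGACY).join(PATCHED);
}

const original = `const configPath = ${LEGACY};`;
const once = patchOnce(original);
const twice = patchOnce(once);
```

Without the guard, a second pass would re-wrap the legacy tail of the already-patched expression.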

View File

@@ -1,14 +1,21 @@
import { spawnSync } from "node:child_process";
import { existsSync, mkdirSync, readFileSync, rmSync, writeFileSync } from "node:fs";
import { createRequire } from "node:module";
import { homedir } from "node:os";
import { dirname, resolve } from "node:path";
import { fileURLToPath } from "node:url";
import { FEYNMAN_LOGO_HTML } from "../logo.mjs";
import { patchPiExtensionLoaderSource } from "./lib/pi-extension-loader-patch.mjs";
import { PI_WEB_ACCESS_PATCH_TARGETS, patchPiWebAccessSource } from "./lib/pi-web-access-patch.mjs";
import { PI_SUBAGENTS_PATCH_TARGETS, patchPiSubagentsSource } from "./lib/pi-subagents-patch.mjs";
const here = dirname(fileURLToPath(import.meta.url));
const appRoot = resolve(here, "..");
const feynmanHome = resolve(process.env.FEYNMAN_HOME ?? homedir(), ".feynman");
const feynmanNpmPrefix = resolve(feynmanHome, "npm-global");
process.env.FEYNMAN_NPM_PREFIX = feynmanNpmPrefix;
process.env.NPM_CONFIG_PREFIX = feynmanNpmPrefix;
process.env.npm_config_prefix = feynmanNpmPrefix;
const appRequire = createRequire(resolve(appRoot, "package.json"));
const isGlobalInstall = process.env.npm_config_global === "true" || process.env.npm_config_location === "global";
@@ -57,6 +64,15 @@ const extensionLoaderPath = piPackageRoot ? resolve(piPackageRoot, "dist", "core
const terminalPath = piTuiRoot ? resolve(piTuiRoot, "dist", "terminal.js") : null;
const editorPath = piTuiRoot ? resolve(piTuiRoot, "dist", "components", "editor.js") : null;
const workspaceRoot = resolve(appRoot, ".feynman", "npm", "node_modules");
const workspaceExtensionLoaderPath = resolve(
workspaceRoot,
"@mariozechner",
"pi-coding-agent",
"dist",
"core",
"extensions",
"loader.js",
);
const vendorOverrideRoot = resolve(appRoot, ".feynman", "vendor-overrides");
const piSubagentsRoot = resolve(workspaceRoot, "pi-subagents");
const webAccessPath = resolve(workspaceRoot, "pi-web-access", "index.ts");
@@ -76,7 +92,17 @@ const workspaceArchivePath = resolve(appRoot, ".feynman", "runtime-workspace.tgz
function createInstallCommand(packageManager, packageSpecs) {
switch (packageManager) {
case "npm":
-return ["install", "--prefer-offline", "--no-audit", "--no-fund", "--loglevel", "error", ...packageSpecs];
+return [
"install",
"--global=false",
"--location=project",
"--prefer-offline",
"--no-audit",
"--no-fund",
"--loglevel",
"error",
...packageSpecs,
];
case "pnpm":
return ["add", "--prefer-offline", "--reporter", "silent", ...packageSpecs];
case "bun":
@@ -367,11 +393,15 @@ if (interactiveModePath && existsSync(interactiveModePath)) {
}
}
-if (extensionLoaderPath && existsSync(extensionLoaderPath)) {
-const source = readFileSync(extensionLoaderPath, "utf8");
+for (const loaderPath of [extensionLoaderPath, workspaceExtensionLoaderPath].filter(Boolean)) {
+if (!existsSync(loaderPath)) {
continue;
}
const source = readFileSync(loaderPath, "utf8");
const patched = patchPiExtensionLoaderSource(source);
if (patched !== source) {
-writeFileSync(extensionLoaderPath, patched, "utf8");
+writeFileSync(loaderPath, patched, "utf8");
}
}
@@ -560,6 +590,21 @@ if (existsSync(webAccessPath)) {
}
}
const piWebAccessRoot = resolve(workspaceRoot, "pi-web-access");
if (existsSync(piWebAccessRoot)) {
for (const relativePath of PI_WEB_ACCESS_PATCH_TARGETS) {
const entryPath = resolve(piWebAccessRoot, relativePath);
if (!existsSync(entryPath)) continue;
const source = readFileSync(entryPath, "utf8");
const patched = patchPiWebAccessSource(relativePath, source);
if (patched !== source) {
writeFileSync(entryPath, patched, "utf8");
}
}
}
if (existsSync(sessionSearchIndexerPath)) {
const source = readFileSync(sessionSearchIndexerPath, "utf8");
const original = 'const sessionsDir = path.join(os.homedir(), ".pi", "agent", "sessions");';

View File

@@ -5,7 +5,7 @@ description: Autonomous experiment loop that tries ideas, measures results, keep
# Autoresearch
-Run the `/autoresearch` workflow. Read the prompt template at `prompts/autoresearch.md` for the full procedure.
+Run the `/autoresearch` workflow. Read the prompt template at `../prompts/autoresearch.md` for the full procedure.
Tools used: `init_experiment`, `run_experiment`, `log_experiment` (from pi-autoresearch)

View File

@@ -5,7 +5,7 @@ description: Contribute changes to the Feynman repository itself. Use when the t
# Contributing
-Read `CONTRIBUTING.md` first, then `AGENTS.md` for repo-level agent conventions.
+Read `../CONTRIBUTING.md` first, then `../AGENTS.md` for repo-level agent conventions.
Use this skill when working on Feynman itself, especially for:

View File

@@ -5,7 +5,7 @@ description: Run a thorough, source-heavy investigation on any topic. Use when t
# Deep Research
-Run the `/deepresearch` workflow. Read the prompt template at `prompts/deepresearch.md` for the full procedure.
+Run the `/deepresearch` workflow. Read the prompt template at `../prompts/deepresearch.md` for the full procedure.
Agents used: `researcher`, `verifier`, `reviewer`

View File

@@ -5,6 +5,6 @@ description: Inspect active background research work including running processes
# Jobs
-Run the `/jobs` workflow. Read the prompt template at `prompts/jobs.md` for the full procedure.
+Run the `/jobs` workflow. Read the prompt template at `../prompts/jobs.md` for the full procedure.
Shows active `pi-processes`, scheduled `pi-schedule-prompt` entries, and running subagent tasks.

View File

@@ -5,7 +5,7 @@ description: Run a literature review using paper search and primary-source synth
# Literature Review
-Run the `/lit` workflow. Read the prompt template at `prompts/lit.md` for the full procedure.
+Run the `/lit` workflow. Read the prompt template at `../prompts/lit.md` for the full procedure.
Agents used: `researcher`, `verifier`, `reviewer`

View File

@@ -5,7 +5,7 @@ description: Compare a paper's claims against its public codebase. Use when the
# Paper-Code Audit
-Run the `/audit` workflow. Read the prompt template at `prompts/audit.md` for the full procedure.
+Run the `/audit` workflow. Read the prompt template at `../prompts/audit.md` for the full procedure.
Agents used: `researcher`, `verifier`

View File

@@ -5,7 +5,7 @@ description: Turn research findings into a polished paper-style draft with secti
# Paper Writing
-Run the `/draft` workflow. Read the prompt template at `prompts/draft.md` for the full procedure.
+Run the `/draft` workflow. Read the prompt template at `../prompts/draft.md` for the full procedure.
Agents used: `writer`, `verifier`

View File

@@ -5,7 +5,7 @@ description: Simulate a tough but constructive peer review of an AI research art
# Peer Review
-Run the `/review` workflow. Read the prompt template at `prompts/review.md` for the full procedure.
+Run the `/review` workflow. Read the prompt template at `../prompts/review.md` for the full procedure.
Agents used: `researcher`, `reviewer`

View File

@@ -5,7 +5,7 @@ description: Plan or execute a replication of a paper, claim, or benchmark. Use
# Replication
-Run the `/replicate` workflow. Read the prompt template at `prompts/replicate.md` for the full procedure.
+Run the `/replicate` workflow. Read the prompt template at `../prompts/replicate.md` for the full procedure.
Agents used: `researcher`

View File

@@ -5,6 +5,6 @@ description: Write a durable session log capturing completed work, findings, ope
# Session Log
-Run the `/log` workflow. Read the prompt template at `prompts/log.md` for the full procedure.
+Run the `/log` workflow. Read the prompt template at `../prompts/log.md` for the full procedure.
Output: session log in `notes/session-logs/`.

View File

@@ -5,7 +5,7 @@ description: Compare multiple sources on a topic and produce a grounded comparis
# Source Comparison
-Run the `/compare` workflow. Read the prompt template at `prompts/compare.md` for the full procedure.
+Run the `/compare` workflow. Read the prompt template at `../prompts/compare.md` for the full procedure.
Agents used: `researcher`, `verifier`

View File

@@ -5,7 +5,7 @@ description: Set up a recurring research watch on a topic, company, paper area,
# Watch
-Run the `/watch` workflow. Read the prompt template at `prompts/watch.md` for the full procedure.
+Run the `/watch` workflow. Read the prompt template at `../prompts/watch.md` for the full procedure.
Agents used: `researcher`

View File

@@ -1,7 +1,7 @@
import { spawn } from "node:child_process";
import { existsSync } from "node:fs";
-import { buildPiArgs, buildPiEnv, type PiRuntimeOptions, resolvePiPaths } from "./runtime.js";
+import { buildPiArgs, buildPiEnv, type PiRuntimeOptions, resolvePiPaths, toNodeImportSpecifier } from "./runtime.js";
import { ensureSupportedNodeVersion } from "../system/node-version.js";
export async function launchPiChat(options: PiRuntimeOptions): Promise<void> {
@@ -23,8 +23,8 @@ export async function launchPiChat(options: PiRuntimeOptions): Promise<void> {
}
const importArgs = useDevPolyfill
-? ["--import", tsxLoaderPath, "--import", promisePolyfillSourcePath]
-: ["--import", promisePolyfillPath];
+? ["--import", toNodeImportSpecifier(tsxLoaderPath), "--import", toNodeImportSpecifier(promisePolyfillSourcePath)]
+: ["--import", toNodeImportSpecifier(promisePolyfillPath)];
const child = spawn(process.execPath, [...importArgs, piCliPath, ...buildPiArgs(options)], {
cwd: options.workingDir,

View File

@@ -1,5 +1,6 @@
import { existsSync, readFileSync } from "node:fs";
-import { delimiter, dirname, resolve } from "node:path";
+import { delimiter, dirname, isAbsolute, resolve } from "node:path";
import { pathToFileURL } from "node:url";
import {
BROWSER_FALLBACK_PATHS,
@@ -47,6 +48,10 @@ export function resolvePiPaths(appRoot: string) {
};
}
export function toNodeImportSpecifier(modulePath: string): string {
return isAbsolute(modulePath) ? pathToFileURL(modulePath).href : modulePath;
}
export function validatePiInstallation(appRoot: string): string[] {
const paths = resolvePiPaths(appRoot);
const missing: string[] = [];
@@ -97,6 +102,7 @@ export function buildPiEnv(options: PiRuntimeOptions): NodeJS.ProcessEnv {
const paths = resolvePiPaths(options.appRoot);
const feynmanNpmPrefixPath = getFeynmanNpmPrefixPath(options.feynmanAgentDir);
const feynmanNpmBinPath = resolve(feynmanNpmPrefixPath, "bin");
const feynmanWebSearchConfigPath = resolve(dirname(options.feynmanAgentDir), "web-search.json");
const currentPath = process.env.PATH ?? "";
const binEntries = [paths.nodeModulesBinPath, resolve(paths.piWorkspaceNodeModulesPath, ".bin"), feynmanNpmBinPath];
@@ -108,6 +114,7 @@ export function buildPiEnv(options: PiRuntimeOptions): NodeJS.ProcessEnv {
FEYNMAN_VERSION: options.feynmanVersion,
FEYNMAN_SESSION_DIR: options.sessionDir,
FEYNMAN_MEMORY_DIR: resolve(dirname(options.feynmanAgentDir), "memory"),
FEYNMAN_WEB_SEARCH_CONFIG: feynmanWebSearchConfigPath,
FEYNMAN_NODE_EXECUTABLE: process.execPath,
FEYNMAN_BIN_PATH: resolve(options.appRoot, "bin", "feynman.js"),
FEYNMAN_NPM_PREFIX: feynmanNpmPrefixPath,

View File

@@ -5,6 +5,7 @@ import { resolve } from "node:path";
export type PiWebSearchProvider = "auto" | "perplexity" | "exa" | "gemini";
export type PiWebAccessConfig = Record<string, unknown> & {
route?: PiWebSearchProvider;
provider?: PiWebSearchProvider;
searchProvider?: PiWebSearchProvider;
perplexityApiKey?: string;
@@ -80,8 +81,9 @@ export function getPiWebAccessStatus(
config: PiWebAccessConfig = loadPiWebAccessConfig(),
configPath = getPiWebSearchConfigPath(),
): PiWebAccessStatus {
-const searchProvider = normalizeProvider(config.searchProvider) ?? "auto";
-const requestProvider = normalizeProvider(config.provider) ?? searchProvider;
+const searchProvider =
+normalizeProvider(config.searchProvider) ?? normalizeProvider(config.route) ?? normalizeProvider(config.provider) ?? "auto";
+const requestProvider = normalizeProvider(config.provider) ?? normalizeProvider(config.route) ?? searchProvider;
const perplexityConfigured = Boolean(normalizeNonEmptyString(config.perplexityApiKey));
const exaConfigured = Boolean(normalizeNonEmptyString(config.exaApiKey));
const geminiApiConfigured = Boolean(normalizeNonEmptyString(config.geminiApiKey));
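A minimal sketch of the provider precedence this hunk implements, with a stubbed `normalizeProvider` standing in for the real one defined elsewhere in `src/pi/web-access.ts`:

```typescript
type Provider = "auto" | "perplexity" | "exa" | "gemini";

// Stub: the real normalizeProvider also validates/normalizes config strings.
function normalizeProvider(value: unknown): Provider | undefined {
  const allowed: readonly Provider[] = ["auto", "perplexity", "exa", "gemini"];
  return allowed.includes(value as Provider) ? (value as Provider) : undefined;
}

// After the patch, the legacy `route` key is honored between the modern
// `searchProvider` and `provider` keys, with "auto" as the final fallback.
function resolveSearchProvider(config: Record<string, unknown>): Provider {
  return (
    normalizeProvider(config.searchProvider) ??
    normalizeProvider(config.route) ??
    normalizeProvider(config.provider) ??
    "auto"
  );
}
```

Configs that only ever set the legacy `route` key keep working, while configs that set both keys resolve the same way they did before the patch.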

View File

@@ -0,0 +1,110 @@
import test from "node:test";
import assert from "node:assert/strict";
import { buildModelStatusSnapshotFromRecords } from "../src/model/catalog.js";
test("buildModelStatusSnapshotFromRecords returns empty guidance when model is set and valid", () => {
const snapshot = buildModelStatusSnapshotFromRecords(
[{ provider: "anthropic", id: "claude-opus-4-6" }],
[{ provider: "anthropic", id: "claude-opus-4-6" }],
"anthropic/claude-opus-4-6",
);
assert.equal(snapshot.currentValid, true);
assert.equal(snapshot.current, "anthropic/claude-opus-4-6");
assert.equal(snapshot.guidance.length, 0);
});
test("buildModelStatusSnapshotFromRecords emits guidance when no models are available", () => {
const snapshot = buildModelStatusSnapshotFromRecords([], [], undefined);
assert.equal(snapshot.currentValid, false);
assert.equal(snapshot.current, undefined);
assert.equal(snapshot.recommended, undefined);
assert.ok(snapshot.guidance.some((line) => line.includes("No authenticated Pi models")));
});
test("buildModelStatusSnapshotFromRecords emits guidance when no default model is set", () => {
const snapshot = buildModelStatusSnapshotFromRecords(
[{ provider: "openai", id: "gpt-5.4" }],
[{ provider: "openai", id: "gpt-5.4" }],
undefined,
);
assert.equal(snapshot.currentValid, false);
assert.equal(snapshot.current, undefined);
assert.ok(snapshot.guidance.some((line) => line.includes("No default research model")));
});
test("buildModelStatusSnapshotFromRecords marks provider as configured only when it has available models", () => {
const snapshot = buildModelStatusSnapshotFromRecords(
[
{ provider: "anthropic", id: "claude-opus-4-6" },
{ provider: "openai", id: "gpt-5.4" },
],
[{ provider: "openai", id: "gpt-5.4" }],
"openai/gpt-5.4",
);
const anthropicProvider = snapshot.providers.find((provider) => provider.id === "anthropic");
const openaiProvider = snapshot.providers.find((provider) => provider.id === "openai");
assert.ok(anthropicProvider);
assert.equal(anthropicProvider!.configured, false);
assert.equal(anthropicProvider!.supportedModels, 1);
assert.equal(anthropicProvider!.availableModels, 0);
assert.ok(openaiProvider);
assert.equal(openaiProvider!.configured, true);
assert.equal(openaiProvider!.supportedModels, 1);
assert.equal(openaiProvider!.availableModels, 1);
});
test("buildModelStatusSnapshotFromRecords marks provider as current when selected model belongs to it", () => {
const snapshot = buildModelStatusSnapshotFromRecords(
[
{ provider: "anthropic", id: "claude-opus-4-6" },
{ provider: "openai", id: "gpt-5.4" },
],
[
{ provider: "anthropic", id: "claude-opus-4-6" },
{ provider: "openai", id: "gpt-5.4" },
],
"anthropic/claude-opus-4-6",
);
const anthropicProvider = snapshot.providers.find((provider) => provider.id === "anthropic");
const openaiProvider = snapshot.providers.find((provider) => provider.id === "openai");
assert.equal(anthropicProvider!.current, true);
assert.equal(openaiProvider!.current, false);
});
test("buildModelStatusSnapshotFromRecords returns available models sorted by research preference", () => {
const snapshot = buildModelStatusSnapshotFromRecords(
[
{ provider: "openai", id: "gpt-5.4" },
{ provider: "anthropic", id: "claude-opus-4-6" },
],
[
{ provider: "openai", id: "gpt-5.4" },
{ provider: "anthropic", id: "claude-opus-4-6" },
],
undefined,
);
assert.equal(snapshot.availableModels[0], "anthropic/claude-opus-4-6");
assert.equal(snapshot.availableModels[1], "openai/gpt-5.4");
assert.equal(snapshot.recommended, "anthropic/claude-opus-4-6");
});
test("buildModelStatusSnapshotFromRecords sets currentValid false when current model is not in available list", () => {
const snapshot = buildModelStatusSnapshotFromRecords(
[{ provider: "anthropic", id: "claude-opus-4-6" }],
[],
"anthropic/claude-opus-4-6",
);
assert.equal(snapshot.currentValid, false);
assert.equal(snapshot.current, "anthropic/claude-opus-4-6");
});
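Taken together, the assertions above pin down the snapshot contract: validity of the current model, provider availability counts, preference-ordered model lists, and guidance messages. A minimal sketch consistent with those expectations might look like the following. This is not the real `src/model/catalog.ts` implementation; the `RESEARCH_PREFERENCE` ordering and the exact guidance wording are assumptions made only to illustrate the shape of the contract.

```typescript
// Hypothetical reconstruction of the snapshot builder, derived from the
// test expectations above. The real implementation may differ.
type ModelRecord = { provider: string; id: string };

// Assumed research-preference ordering; unknown providers sort last.
const RESEARCH_PREFERENCE = ["anthropic", "openai"];

function preferenceRank(fullId: string): number {
  const index = RESEARCH_PREFERENCE.indexOf(fullId.split("/")[0]);
  return index === -1 ? RESEARCH_PREFERENCE.length : index;
}

export function buildModelStatusSnapshotFromRecords(
  supported: ModelRecord[],
  available: ModelRecord[],
  current: string | undefined,
) {
  // Available models as "provider/id" strings, sorted by research preference.
  const availableModels = available
    .map((record) => `${record.provider}/${record.id}`)
    .sort((a, b) => preferenceRank(a) - preferenceRank(b));

  // One entry per supported provider; "configured" requires at least one
  // available model, and "current" means the selected model belongs to it.
  const providers = [...new Set(supported.map((record) => record.provider))].map((id) => {
    const availableCount = available.filter((record) => record.provider === id).length;
    return {
      id,
      supportedModels: supported.filter((record) => record.provider === id).length,
      availableModels: availableCount,
      configured: availableCount > 0,
      current: current !== undefined && current.startsWith(`${id}/`),
    };
  });

  // Guidance wording here is a placeholder; the tests only check substrings.
  const guidance: string[] = [];
  if (availableModels.length === 0) {
    guidance.push("No authenticated Pi models are available.");
  } else if (current === undefined) {
    guidance.push("No default research model is set.");
  }

  return {
    current,
    currentValid: current !== undefined && availableModels.includes(current),
    recommended: availableModels[0],
    availableModels,
    providers,
    guidance,
  };
}
```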


@@ -0,0 +1,92 @@
import test from "node:test";
import assert from "node:assert/strict";
import { existsSync, mkdtempSync, rmSync } from "node:fs";
import { tmpdir } from "node:os";
import { join, resolve } from "node:path";
import {
ensureFeynmanHome,
getBootstrapStatePath,
getDefaultSessionDir,
getFeynmanAgentDir,
getFeynmanHome,
getFeynmanMemoryDir,
getFeynmanStateDir,
} from "../src/config/paths.js";
test("getFeynmanHome uses FEYNMAN_HOME env var when set", () => {
const previous = process.env.FEYNMAN_HOME;
try {
process.env.FEYNMAN_HOME = "/custom/home";
assert.equal(getFeynmanHome(), resolve("/custom/home", ".feynman"));
} finally {
if (previous === undefined) {
delete process.env.FEYNMAN_HOME;
} else {
process.env.FEYNMAN_HOME = previous;
}
}
});
test("getFeynmanHome falls back to homedir when FEYNMAN_HOME is unset", () => {
const previous = process.env.FEYNMAN_HOME;
try {
delete process.env.FEYNMAN_HOME;
const home = getFeynmanHome();
assert.ok(home.endsWith(".feynman"), `expected path ending in .feynman, got: ${home}`);
assert.ok(!home.includes("undefined"), `expected no 'undefined' in path, got: ${home}`);
} finally {
if (previous === undefined) {
delete process.env.FEYNMAN_HOME;
} else {
process.env.FEYNMAN_HOME = previous;
}
}
});
test("getFeynmanAgentDir resolves to <home>/agent", () => {
assert.equal(getFeynmanAgentDir("/some/home"), resolve("/some/home", "agent"));
});
test("getFeynmanMemoryDir resolves to <home>/memory", () => {
assert.equal(getFeynmanMemoryDir("/some/home"), resolve("/some/home", "memory"));
});
test("getFeynmanStateDir resolves to <home>/.state", () => {
assert.equal(getFeynmanStateDir("/some/home"), resolve("/some/home", ".state"));
});
test("getDefaultSessionDir resolves to <home>/sessions", () => {
assert.equal(getDefaultSessionDir("/some/home"), resolve("/some/home", "sessions"));
});
test("getBootstrapStatePath resolves to <home>/.state/bootstrap.json", () => {
assert.equal(getBootstrapStatePath("/some/home"), resolve("/some/home", ".state", "bootstrap.json"));
});
test("ensureFeynmanHome creates all required subdirectories", () => {
const root = mkdtempSync(join(tmpdir(), "feynman-paths-"));
try {
const home = join(root, "home");
ensureFeynmanHome(home);
assert.ok(existsSync(home), "home dir should exist");
assert.ok(existsSync(join(home, "agent")), "agent dir should exist");
assert.ok(existsSync(join(home, "memory")), "memory dir should exist");
assert.ok(existsSync(join(home, ".state")), ".state dir should exist");
assert.ok(existsSync(join(home, "sessions")), "sessions dir should exist");
} finally {
rmSync(root, { recursive: true, force: true });
}
});
test("ensureFeynmanHome is idempotent when dirs already exist", () => {
const root = mkdtempSync(join(tmpdir(), "feynman-paths-"));
try {
const home = join(root, "home");
ensureFeynmanHome(home);
assert.doesNotThrow(() => ensureFeynmanHome(home));
} finally {
rmSync(root, { recursive: true, force: true });
}
});
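The path helpers under test can be sketched compactly. This is a hedged reconstruction from the assertions above, not the actual `src/config/paths.ts` code: `FEYNMAN_HOME` overrides the base directory, everything hangs off `<base>/.feynman`, and directory creation is idempotent.

```typescript
import { mkdirSync } from "node:fs";
import { homedir } from "node:os";
import { resolve } from "node:path";

// Hypothetical sketch of the helpers exercised by the tests above.
export function getFeynmanHome(): string {
  const base = process.env.FEYNMAN_HOME ?? homedir();
  return resolve(base, ".feynman");
}

export function ensureFeynmanHome(home: string): void {
  // mkdirSync with { recursive: true } succeeds when directories already
  // exist, which is what makes ensureFeynmanHome safe to call repeatedly.
  mkdirSync(home, { recursive: true });
  for (const sub of ["agent", "memory", ".state", "sessions"]) {
    mkdirSync(resolve(home, sub), { recursive: true });
  }
}
```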


@@ -0,0 +1,32 @@
import test from "node:test";
import assert from "node:assert/strict";
import { readdirSync, readFileSync } from "node:fs";
import { dirname, join, resolve } from "node:path";
import { fileURLToPath } from "node:url";
const repoRoot = resolve(dirname(fileURLToPath(import.meta.url)), "..");
const bannedPatterns = [/ValiChord/i, /Harmony Record/i, /harmony_record_/i];
function collectMarkdownFiles(root: string): string[] {
const files: string[] = [];
for (const entry of readdirSync(root, { withFileTypes: true })) {
const fullPath = join(root, entry.name);
if (entry.isDirectory()) {
files.push(...collectMarkdownFiles(fullPath));
continue;
}
if (entry.isFile() && fullPath.endsWith(".md")) {
files.push(fullPath);
}
}
return files;
}
test("bundled prompts and skills do not contain blocked promotional product content", () => {
for (const filePath of [...collectMarkdownFiles(join(repoRoot, "prompts")), ...collectMarkdownFiles(join(repoRoot, "skills"))]) {
const content = readFileSync(filePath, "utf8");
for (const pattern of bannedPatterns) {
assert.doesNotMatch(content, pattern, `${filePath} contains blocked promotional pattern ${pattern}`);
}
}
});


@@ -1,7 +1,8 @@
import test from "node:test";
import assert from "node:assert/strict";
import { pathToFileURL } from "node:url";
import { applyFeynmanPackageManagerEnv, buildPiArgs, buildPiEnv, resolvePiPaths, toNodeImportSpecifier } from "../src/pi/runtime.js";
test("buildPiArgs includes configured runtime paths and prompt", () => {
const args = buildPiArgs({
@@ -106,3 +107,11 @@ test("resolvePiPaths includes the Promise.withResolvers polyfill path", () => {
assert.equal(paths.promisePolyfillPath, "/repo/feynman/dist/system/promise-polyfill.js");
});
test("toNodeImportSpecifier converts absolute preload paths to file URLs", () => {
assert.equal(
toNodeImportSpecifier("/repo/feynman/dist/system/promise-polyfill.js"),
pathToFileURL("/repo/feynman/dist/system/promise-polyfill.js").href,
);
assert.equal(toNodeImportSpecifier("tsx"), "tsx");
});
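The regression test above constrains the helper's behavior: absolute preload paths must become `file://` URLs (required so `node --import` resolves them correctly on Windows, where a bare path like `C:\app\preload.js` would be misparsed as a URL with scheme `c:`), while bare package specifiers such as `tsx` pass through unchanged. A minimal sketch consistent with that contract, not necessarily the actual `src/pi/runtime.ts` implementation:

```typescript
import { isAbsolute } from "node:path";
import { pathToFileURL } from "node:url";

// Hypothetical sketch of toNodeImportSpecifier: convert absolute
// filesystem paths to file:// URLs, leave bare specifiers untouched.
export function toNodeImportSpecifier(specifier: string): string {
  return isAbsolute(specifier) ? pathToFileURL(specifier).href : specifier;
}
```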


@@ -0,0 +1,48 @@
import test from "node:test";
import assert from "node:assert/strict";
import { patchPiWebAccessSource } from "../scripts/lib/pi-web-access-patch.mjs";
test("patchPiWebAccessSource rewrites legacy Pi web-search config paths", () => {
const input = [
'import { join } from "node:path";',
'import { homedir } from "node:os";',
'const CONFIG_PATH = join(homedir(), ".pi", "web-search.json");',
"",
].join("\n");
const patched = patchPiWebAccessSource("perplexity.ts", input);
assert.match(patched, /FEYNMAN_WEB_SEARCH_CONFIG/);
assert.match(patched, /PI_WEB_SEARCH_CONFIG/);
});
test("patchPiWebAccessSource updates index.ts directory handling", () => {
const input = [
'import { existsSync, mkdirSync } from "node:fs";',
'import { join } from "node:path";',
'import { homedir } from "node:os";',
'const WEB_SEARCH_CONFIG_PATH = join(homedir(), ".pi", "web-search.json");',
'const dir = join(homedir(), ".pi");',
"",
].join("\n");
const patched = patchPiWebAccessSource("index.ts", input);
assert.match(patched, /import \{ dirname, join \} from "node:path";/);
assert.match(patched, /const dir = dirname\(WEB_SEARCH_CONFIG_PATH\);/);
});
test("patchPiWebAccessSource is idempotent", () => {
const input = [
'import { join } from "node:path";',
'import { homedir } from "node:os";',
'const CONFIG_PATH = join(homedir(), ".pi", "web-search.json");',
"",
].join("\n");
const once = patchPiWebAccessSource("perplexity.ts", input);
const twice = patchPiWebAccessSource("perplexity.ts", once);
assert.equal(twice, once);
});


@@ -67,6 +67,17 @@ test("getPiWebAccessStatus reads Gemini routes directly", () => {
assert.equal(status.chromeProfile, "Profile 2");
});
test("getPiWebAccessStatus supports the legacy route key", () => {
const status = getPiWebAccessStatus({
route: "perplexity",
perplexityApiKey: "pplx_...",
});
assert.equal(status.routeLabel, "Perplexity");
assert.equal(status.requestProvider, "perplexity");
assert.equal(status.perplexityConfigured, true);
});
test("formatPiWebAccessDoctorLines reports Pi-managed web access", () => {
const lines = formatPiWebAccessDoctorLines(
getPiWebAccessStatus({

tests/skill-paths.test.ts

@@ -0,0 +1,28 @@
import test from "node:test";
import assert from "node:assert/strict";
import { existsSync, readdirSync, readFileSync } from "node:fs";
import { dirname, join, resolve } from "node:path";
import { fileURLToPath } from "node:url";
const repoRoot = resolve(dirname(fileURLToPath(import.meta.url)), "..");
const skillsRoot = join(repoRoot, "skills");
const markdownPathPattern = /`((?:\.\.?\/)(?:[A-Za-z0-9._-]+\/)*[A-Za-z0-9._-]+\.md)`/g;
const simulatedInstallRoot = join(repoRoot, "__skill-install-root__");
test("all local markdown references in bundled skills resolve in the installed skill layout", () => {
for (const entry of readdirSync(skillsRoot, { withFileTypes: true })) {
if (!entry.isDirectory()) continue;
const skillPath = join(skillsRoot, entry.name, "SKILL.md");
if (!existsSync(skillPath)) continue;
const content = readFileSync(skillPath, "utf8");
for (const match of content.matchAll(markdownPathPattern)) {
const reference = match[1];
const installedSkillDir = join(simulatedInstallRoot, entry.name);
const installedTarget = resolve(installedSkillDir, reference);
const repoTarget = installedTarget.replace(simulatedInstallRoot, repoRoot);
assert.ok(existsSync(repoTarget), `${skillPath} references missing installed markdown file ${reference}`);
}
}
});


@@ -181,8 +181,8 @@ step "Extracting skills"
tar -xzf "$archive_path" -C "$extract_dir"
source_root="$(find "$extract_dir" -mindepth 1 -maxdepth 1 -type d | head -n 1)"
if [ -z "$source_root" ] || [ ! -d "$source_root/skills" ] || [ ! -d "$source_root/prompts" ]; then
echo "Could not find the bundled skills resources in the downloaded archive." >&2
exit 1
fi
@@ -190,6 +190,10 @@ mkdir -p "$(dirname "$install_dir")"
rm -rf "$install_dir"
mkdir -p "$install_dir"
cp -R "$source_root/skills/." "$install_dir/"
mkdir -p "$install_dir/prompts"
cp -R "$source_root/prompts/." "$install_dir/prompts/"
cp "$source_root/AGENTS.md" "$install_dir/AGENTS.md"
cp "$source_root/CONTRIBUTING.md" "$install_dir/CONTRIBUTING.md"
step "Installed skills to $install_dir"
case "$SCOPE" in


@@ -92,8 +92,9 @@ try {
}
$skillsSource = Join-Path $sourceRoot.FullName "skills"
$promptsSource = Join-Path $sourceRoot.FullName "prompts"
if (-not (Test-Path $skillsSource) -or -not (Test-Path $promptsSource)) {
    throw "Could not find the bundled skills resources in the downloaded archive."
}
$installParent = Split-Path $installDir -Parent
@@ -107,6 +108,10 @@ try {
New-Item -ItemType Directory -Path $installDir -Force | Out-Null
Copy-Item -Path (Join-Path $skillsSource "*") -Destination $installDir -Recurse -Force
New-Item -ItemType Directory -Path (Join-Path $installDir "prompts") -Force | Out-Null
Copy-Item -Path (Join-Path $promptsSource "*") -Destination (Join-Path $installDir "prompts") -Recurse -Force
Copy-Item -Path (Join-Path $sourceRoot.FullName "AGENTS.md") -Destination (Join-Path $installDir "AGENTS.md") -Force
Copy-Item -Path (Join-Path $sourceRoot.FullName "CONTRIBUTING.md") -Destination (Join-Path $installDir "CONTRIBUTING.md") -Force
Write-Host "==> Installed skills to $installDir"
if ($Scope -eq "Repo") {


@@ -125,12 +125,18 @@ Workarounds:
New-Item -ItemType Directory -Path $installBinDir -Force | Out-Null
$shimPath = Join-Path $installBinDir "feynman.cmd"
$shimPs1Path = Join-Path $installBinDir "feynman.ps1"
Write-Host "==> Linking feynman into $installBinDir"
@"
@echo off
CALL "$bundleDir\feynman.cmd" %*
"@ | Set-Content -Path $shimPath -Encoding ASCII
@"
`$BundleDir = "$bundleDir"
& "`$BundleDir\node\node.exe" "`$BundleDir\app\bin\feynman.js" @args
"@ | Set-Content -Path $shimPs1Path -Encoding UTF8
$currentUserPath = [Environment]::GetEnvironmentVariable("Path", "User")
$alreadyOnPath = $false
if ($currentUserPath) {


@@ -55,20 +55,20 @@ Or install them repo-locally:
& ([scriptblock]::Create((irm https://feynman.is/install-skills.ps1))) -Scope Repo
```

These installers download the bundled `skills/` and `prompts/` trees plus the repo guidance files referenced by those skills. They do not install the Feynman terminal, bundled Node runtime, auth storage, or Pi packages.

## Pinned releases

The one-line installer already targets the latest tagged release. To pin an exact version, pass it explicitly:

```bash
curl -fsSL https://feynman.is/install | bash -s -- 0.2.17
```

On Windows:

```powershell
& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.17
```

## pnpm