31 Commits

Author SHA1 Message Date
Advait Paliwal
b3a82d4a92 switch release workflow to binary only 2026-04-10 11:02:50 -07:00
Advait Paliwal
790824af20 verify rpc and website gates 2026-04-10 10:49:54 -07:00
Advait Paliwal
4137a29507 remove stale web access override 2026-04-10 10:20:31 -07:00
Advait Paliwal
5b9362918e document local model setup 2026-04-09 13:45:19 -07:00
Advait Paliwal
bfa538fa00 triage remaining tracker fixes 2026-04-09 10:34:29 -07:00
Advait Paliwal
96234425ba harden installers rendering and dependency hygiene 2026-04-09 10:27:23 -07:00
Advait Paliwal
3148f2e62b fix startup packaging and content guardrails 2026-04-09 10:09:05 -07:00
Advait Paliwal
554350cc0e Finish backlog cleanup for Pi integration 2026-03-31 11:02:07 -07:00
Advait Paliwal
d9812cf4f2 Fix Pi package updates and merge feynman-model 2026-03-31 09:18:05 -07:00
Advait Paliwal
aed607ce62 release: bump to 0.2.16 2026-03-28 21:46:57 -07:00
Advait Paliwal
ab8a284c74 fix: respect feynman agent dir in vendored pi-subagents 2026-03-28 21:44:50 -07:00
Advait Paliwal
62d63be1d8 chore: remove valichord integration 2026-03-28 13:56:48 -07:00
Advait Paliwal
e2fdf0d505 fix: exclude release bundles from npm publish 2026-03-27 14:04:16 -07:00
Advait Paliwal
cba7532d59 release: bump to 0.2.15 2026-03-27 13:58:55 -07:00
topeuph-ai
2dea96f25f feat: add valichord-validation skill — blind commit-reveal reproducibility verification
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 13:57:41 -07:00
Advait Paliwal
83a570235f docs: add contributor guide and repo skill 2026-03-27 12:09:09 -07:00
Advait Paliwal
ff6328121e fix: align .nvmrc with supported node floor 2026-03-27 11:36:49 -07:00
Advait Paliwal
404c8b5469 Unify installers on tagged releases 2026-03-26 18:17:48 -07:00
Advait Paliwal
4c62e78ca5 fix: enforce bundled node version floor 2026-03-26 17:49:11 -07:00
Advait Paliwal
10c93a673b fix: align declared node version floor 2026-03-26 17:22:56 -07:00
Mochamad Chairulridjal
30d07246d1 feat: add API key and custom provider configuration (#4)
* feat: add API key and custom provider configuration

Previously, model setup only offered OAuth login. This adds:

- API key configuration for 17 built-in providers (OpenAI, Anthropic,
  Google, Mistral, Groq, xAI, OpenRouter, etc.)
- Custom provider setup via models.json (for Ollama, vLLM, LM Studio,
  proxies, or any OpenAI/Anthropic/Google-compatible endpoint)
- Interactive prompts with smart defaults and auto-detection of models
- Verification flow that probes endpoints and provides actionable tips
- Doctor diagnostics for models.json path and missing apiKey warnings
- Dev environment fallback for running without dist/ build artifacts
- Unified auth flow: `feynman model login` now offers both API key
  and OAuth options (OAuth-only when a specific provider is given)

New files:
- src/model/models-json.ts: Read/write models.json with proper merging
- src/model/registry.ts: Centralized ModelRegistry creation with modelsJsonPath
- tests/models-json.test.ts: Unit tests for provider config upsert
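The "proper merging" behavior that src/model/models-json.ts and its test exercise can be sketched roughly as follows. The schema shown here (a `providers` map keyed by provider name) is an assumption for illustration, not the project's actual file format:

```typescript
// Hypothetical shape of a models.json-style config; the real schema may differ.
interface ProviderConfig { baseUrl?: string; apiKey?: string; models?: string[] }
interface ModelsJson { providers: Record<string, ProviderConfig> }

// Upsert a provider entry without clobbering fields the user already set:
// a later apiKey update keeps a previously configured baseUrl, for example.
function upsertProvider(
  current: ModelsJson,
  name: string,
  patch: ProviderConfig,
): ModelsJson {
  const existing = current.providers[name] ?? {};
  return {
    ...current,
    providers: {
      ...current.providers,
      // Shallow merge: fields in `patch` win, untouched fields survive.
      [name]: { ...existing, ...patch },
    },
  };
}
```

The returned object is a fresh copy, so callers can diff old vs. new config before writing the file back.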

* fix: harden runtime env and custom provider auth

---------

Co-authored-by: Advait Paliwal <advaitspaliwal@gmail.com>
2026-03-26 17:09:38 -07:00
Jeremy
dbd89d8e3d Claude/windows install compatibility tr di s (#3)
* Fix Windows PowerShell 5.1 compatibility in installer

Use $env:PROCESSOR_ARCHITECTURE for arch detection instead of
RuntimeInformation::OSArchitecture which may not be loaded in
every Windows PowerShell 5.1 session. Also fix null-reference
when user PATH environment variable is empty.

https://claude.ai/code/session_01VFiRDM2ZweyacXN5JneVoP

* Fix executable resolution and tar extraction on Windows

resolveExecutable() used `sh -lc "command -v ..."` which doesn't work
on Windows (no sh). Now uses `cmd /c where` on win32. Also make tar
workspace restoration tolerate symlink failures on Windows — .bin/
symlinks can't be created without Developer Mode, but the actual
package directories are extracted fine.

https://claude.ai/code/session_01VFiRDM2ZweyacXN5JneVoP
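The platform-aware lookup this commit describes can be sketched like so. `resolveExecutable` is the name from the commit message; the body here is an illustrative reconstruction, not the repository's actual implementation:

```typescript
import { execFileSync } from "node:child_process";

// Cross-platform "which"-style lookup: `where` via cmd on Windows (which has
// no `sh`), `command -v` via sh everywhere else.
function resolveExecutable(name: string): string | null {
  try {
    const out =
      process.platform === "win32"
        ? execFileSync("cmd", ["/c", "where", name], { encoding: "utf8" })
        : execFileSync("sh", ["-lc", `command -v ${name}`], { encoding: "utf8" });
    // `where` may print several matches; take the first non-empty line.
    const first = out.split(/\r?\n/).find((line) => line.trim().length > 0);
    return first ? first.trim() : null;
  } catch {
    return null; // lookup command exited non-zero: not on PATH
  }
}
```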

* Broad Windows compatibility fixes across the codebase

- runtime.ts: Use path.delimiter instead of hardcoded ":" for PATH
  construction — was completely broken on Windows
- executables.ts: Add Windows fallback paths for Chrome, Edge, Brave,
  and Pandoc in Program Files; skip macOS-only paths on win32
- node-version.ts, check-node-version.mjs, bin/feynman.js: Show
  Windows-appropriate install instructions (irm | iex, nodejs.org)
  instead of nvm/curl on win32
- preview.ts: Support winget for pandoc auto-install on Windows, and
  apt on Linux (was macOS/brew only)
- launch.ts: Catch unsupported signal errors on Windows
- README.md: Add Windows PowerShell commands alongside macOS/Linux
  for all install instructions

https://claude.ai/code/session_01VFiRDM2ZweyacXN5JneVoP
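The runtime.ts fix in the first bullet boils down to using `path.delimiter` when assembling PATH. A minimal sketch of that idea (helper name and directories are illustrative):

```typescript
import path from "node:path";

// Hardcoding ":" as the PATH separator breaks on Windows, where the
// delimiter is ";". `path.delimiter` is correct on every platform.
function buildPath(extraDirs: string[], existing: string = process.env.PATH ?? ""): string {
  return [...extraDirs, existing].filter(Boolean).join(path.delimiter);
}

// POSIX: "/opt/tools/bin:/usr/bin"; Windows would join with ";" instead.
console.log(buildPath(["/opt/tools/bin"], "/usr/bin"));
```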

* fix: complete windows bootstrap hardening

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Advait Paliwal <advaitspaliwal@gmail.com>
2026-03-26 17:08:14 -07:00
Advait Paliwal
c8536583bf Add skills-only installers 2026-03-25 14:52:20 -07:00
Advait Paliwal
ca74226c83 Fix mobile website overflow 2026-03-25 14:42:08 -07:00
Advait Paliwal
bc9fa2be86 Fix runtime package resolution and tty shutdown 2026-03-25 14:02:38 -07:00
Advait Paliwal
f6dbacc9d5 Update runtime checks and installer behavior 2026-03-25 13:55:32 -07:00
Advait Paliwal
572de7ba85 Clean up README: single install line, fix replicate descriptions
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 12:29:34 -07:00
Advait Paliwal
85e0c4d8c4 Register alphaXiv research tools as native Pi tools
Replace the alpha-research CLI skill with direct programmatic Pi tool
registrations via @companion-ai/alpha-hub/lib. Tools connect to alphaXiv's
MCP server through the library and reuse the connection across calls
instead of spawning a new CLI process each time.

Registers: alpha_search, alpha_get_paper, alpha_ask_paper,
alpha_annotate_paper, alpha_list_annotations, alpha_read_code.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 11:07:42 -07:00
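The connection-reuse pattern this commit describes (one MCP connection shared across tool calls, instead of a fresh CLI process per call) looks roughly like a memoized client promise. Everything below is a generic sketch: `McpClient`, `connectToServer`, and `alphaSearch` are hypothetical stand-ins, not the actual @companion-ai/alpha-hub API:

```typescript
// Stand-in client type; the real library's interface will differ.
type McpClient = { call: (tool: string, args: unknown) => Promise<unknown> };

let clientPromise: Promise<McpClient> | null = null;

async function connectToServer(): Promise<McpClient> {
  // Imagine an expensive MCP handshake here; stubbed for the sketch.
  return { call: async (tool, args) => ({ tool, args }) };
}

// Every tool handler funnels through this, so the handshake runs at most
// once per process instead of once per tool invocation.
function getClient(): Promise<McpClient> {
  if (!clientPromise) clientPromise = connectToServer();
  return clientPromise;
}

async function alphaSearch(query: string) {
  const client = await getClient();
  return client.call("alpha_search", { query });
}
```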
Advait Paliwal
584d065902 Open OAuth login URLs during setup 2026-03-25 10:04:45 -07:00
Advait Paliwal
151956ea24 Prune removed bundled skills during bootstrap sync 2026-03-25 01:37:08 -07:00
Advait Paliwal
75b0467761 Fix release publish job and add eli5 skill 2026-03-25 01:33:10 -07:00
104 changed files with 5696 additions and 527 deletions

.astro/content.d.ts (vendored new file, 154 lines)

@@ -0,0 +1,154 @@
declare module 'astro:content' {
export interface RenderResult {
Content: import('astro/runtime/server/index.js').AstroComponentFactory;
headings: import('astro').MarkdownHeading[];
remarkPluginFrontmatter: Record<string, any>;
}
interface Render {
'.md': Promise<RenderResult>;
}
export interface RenderedContent {
html: string;
metadata?: {
imagePaths: Array<string>;
[key: string]: unknown;
};
}
type Flatten<T> = T extends { [K: string]: infer U } ? U : never;
export type CollectionKey = keyof DataEntryMap;
export type CollectionEntry<C extends CollectionKey> = Flatten<DataEntryMap[C]>;
type AllValuesOf<T> = T extends any ? T[keyof T] : never;
export type ReferenceDataEntry<
C extends CollectionKey,
E extends keyof DataEntryMap[C] = string,
> = {
collection: C;
id: E;
};
export type ReferenceLiveEntry<C extends keyof LiveContentConfig['collections']> = {
collection: C;
id: string;
};
export function getCollection<C extends keyof DataEntryMap, E extends CollectionEntry<C>>(
collection: C,
filter?: (entry: CollectionEntry<C>) => entry is E,
): Promise<E[]>;
export function getCollection<C extends keyof DataEntryMap>(
collection: C,
filter?: (entry: CollectionEntry<C>) => unknown,
): Promise<CollectionEntry<C>[]>;
export function getLiveCollection<C extends keyof LiveContentConfig['collections']>(
collection: C,
filter?: LiveLoaderCollectionFilterType<C>,
): Promise<
import('astro').LiveDataCollectionResult<LiveLoaderDataType<C>, LiveLoaderErrorType<C>>
>;
export function getEntry<
C extends keyof DataEntryMap,
E extends keyof DataEntryMap[C] | (string & {}),
>(
entry: ReferenceDataEntry<C, E>,
): E extends keyof DataEntryMap[C]
? Promise<DataEntryMap[C][E]>
: Promise<CollectionEntry<C> | undefined>;
export function getEntry<
C extends keyof DataEntryMap,
E extends keyof DataEntryMap[C] | (string & {}),
>(
collection: C,
id: E,
): E extends keyof DataEntryMap[C]
? string extends keyof DataEntryMap[C]
? Promise<DataEntryMap[C][E]> | undefined
: Promise<DataEntryMap[C][E]>
: Promise<CollectionEntry<C> | undefined>;
export function getLiveEntry<C extends keyof LiveContentConfig['collections']>(
collection: C,
filter: string | LiveLoaderEntryFilterType<C>,
): Promise<import('astro').LiveDataEntryResult<LiveLoaderDataType<C>, LiveLoaderErrorType<C>>>;
/** Resolve an array of entry references from the same collection */
export function getEntries<C extends keyof DataEntryMap>(
entries: ReferenceDataEntry<C, keyof DataEntryMap[C]>[],
): Promise<CollectionEntry<C>[]>;
export function render<C extends keyof DataEntryMap>(
entry: DataEntryMap[C][string],
): Promise<RenderResult>;
export function reference<
C extends
| keyof DataEntryMap
// Allow generic `string` to avoid excessive type errors in the config
// if `dev` is not running to update as you edit.
// Invalid collection names will be caught at build time.
| (string & {}),
>(
collection: C,
): import('astro/zod').ZodPipe<
import('astro/zod').ZodString,
import('astro/zod').ZodTransform<
C extends keyof DataEntryMap
? {
collection: C;
id: string;
}
: never,
string
>
>;
type ReturnTypeOrOriginal<T> = T extends (...args: any[]) => infer R ? R : T;
type InferEntrySchema<C extends keyof DataEntryMap> = import('astro/zod').infer<
ReturnTypeOrOriginal<Required<ContentConfig['collections'][C]>['schema']>
>;
type ExtractLoaderConfig<T> = T extends { loader: infer L } ? L : never;
type InferLoaderSchema<
C extends keyof DataEntryMap,
L = ExtractLoaderConfig<ContentConfig['collections'][C]>,
> = L extends { schema: import('astro/zod').ZodSchema }
? import('astro/zod').infer<L['schema']>
: any;
type DataEntryMap = {
};
type ExtractLoaderTypes<T> = T extends import('astro/loaders').LiveLoader<
infer TData,
infer TEntryFilter,
infer TCollectionFilter,
infer TError
>
? { data: TData; entryFilter: TEntryFilter; collectionFilter: TCollectionFilter; error: TError }
: { data: never; entryFilter: never; collectionFilter: never; error: never };
type ExtractEntryFilterType<T> = ExtractLoaderTypes<T>['entryFilter'];
type ExtractCollectionFilterType<T> = ExtractLoaderTypes<T>['collectionFilter'];
type ExtractErrorType<T> = ExtractLoaderTypes<T>['error'];
type LiveLoaderDataType<C extends keyof LiveContentConfig['collections']> =
LiveContentConfig['collections'][C]['schema'] extends undefined
? ExtractDataType<LiveContentConfig['collections'][C]['loader']>
: import('astro/zod').infer<
Exclude<LiveContentConfig['collections'][C]['schema'], undefined>
>;
type LiveLoaderEntryFilterType<C extends keyof LiveContentConfig['collections']> =
ExtractEntryFilterType<LiveContentConfig['collections'][C]['loader']>;
type LiveLoaderCollectionFilterType<C extends keyof LiveContentConfig['collections']> =
ExtractCollectionFilterType<LiveContentConfig['collections'][C]['loader']>;
type LiveLoaderErrorType<C extends keyof LiveContentConfig['collections']> = ExtractErrorType<
LiveContentConfig['collections'][C]['loader']
>;
export type ContentConfig = never;
export type LiveContentConfig = never;
}

.astro/types.d.ts (vendored new file, 2 lines)

@@ -0,0 +1,2 @@
/// <reference types="astro/client" />
/// <reference path="content.d.ts" />


@@ -6,6 +6,20 @@ FEYNMAN_THINKING=medium
 OPENAI_API_KEY=
 ANTHROPIC_API_KEY=
+GEMINI_API_KEY=
+OPENROUTER_API_KEY=
+ZAI_API_KEY=
+KIMI_API_KEY=
+MINIMAX_API_KEY=
+MINIMAX_CN_API_KEY=
+MISTRAL_API_KEY=
+GROQ_API_KEY=
+XAI_API_KEY=
+CEREBRAS_API_KEY=
+HF_TOKEN=
+OPENCODE_API_KEY=
+AI_GATEWAY_API_KEY=
+AZURE_OPENAI_API_KEY=
 RUNPOD_API_KEY=
 MODAL_TOKEN_ID=


@@ -9,7 +9,7 @@ Operating rules:
 - State uncertainty explicitly.
 - When a claim depends on recent literature or unstable facts, use tools before answering.
 - When discussing papers, cite title, year, and identifier or URL when possible.
-- Use the alpha-research skill for academic paper search, paper reading, paper Q&A, repository inspection, and persistent annotations.
+- Use the `alpha` CLI for academic paper search, paper reading, paper Q&A, repository inspection, and persistent annotations.
 - Use `web_search`, `fetch_content`, and `get_search_content` first for current topics: products, companies, markets, regulations, software releases, model availability, model pricing, benchmarks, docs, or anything phrased as latest/current/recent/today.
 - For mixed topics, combine both: use web sources for current reality and paper sources for background literature.
 - Never answer a latest/current question from arXiv or alpha-backed paper search alone.


@@ -1,5 +1,6 @@
 {
   "packages": [
+    "npm:@companion-ai/alpha-hub",
     "npm:pi-subagents",
     "npm:pi-btw",
     "npm:pi-docparser",


@@ -13,7 +13,7 @@ jobs:
     runs-on: blacksmith-4vcpu-ubuntu-2404
     outputs:
       version: ${{ steps.version.outputs.version }}
-      should_publish: ${{ steps.version.outputs.should_publish }}
+      should_release: ${{ steps.version.outputs.should_release }}
     steps:
       - uses: actions/checkout@v6
       - uses: actions/setup-node@v5
@@ -21,37 +21,20 @@ jobs:
           node-version: 24.14.0
       - id: version
         shell: bash
+        env:
+          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
         run: |
-          CURRENT=$(npm view @companion-ai/feynman version 2>/dev/null || echo "0.0.0")
           LOCAL=$(node -p "require('./package.json').version")
           echo "version=$LOCAL" >> "$GITHUB_OUTPUT"
-          if [ "$CURRENT" != "$LOCAL" ]; then
-            echo "should_publish=true" >> "$GITHUB_OUTPUT"
+          if gh release view "v$LOCAL" >/dev/null 2>&1; then
+            echo "should_release=false" >> "$GITHUB_OUTPUT"
           else
-            echo "should_publish=false" >> "$GITHUB_OUTPUT"
+            echo "should_release=true" >> "$GITHUB_OUTPUT"
           fi
-  publish-npm:
-    needs: version-check
-    if: needs.version-check.outputs.should_publish == 'true'
-    runs-on: blacksmith-4vcpu-ubuntu-2404
-    permissions:
-      contents: read
-    steps:
-      - uses: actions/checkout@v6
-      - uses: actions/setup-node@v5
-        with:
-          node-version: 24.14.0
-          registry-url: https://registry.npmjs.org
-      - run: npm ci --ignore-scripts
-      - run: npm run build
-      - run: npm test
-      - run: npm publish --access public
-        env:
-          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
   build-native-bundles:
     needs: version-check
+    if: needs.version-check.outputs.should_release == 'true'
     strategy:
       fail-fast: false
       matrix:
@@ -97,52 +80,11 @@ jobs:
           name: native-${{ matrix.id }}
           path: dist/release/*
-  release-edge:
-    needs:
-      - version-check
-      - build-native-bundles
-    if: needs.build-native-bundles.result == 'success'
-    runs-on: blacksmith-4vcpu-ubuntu-2404
-    permissions:
-      contents: write
-    steps:
-      - uses: actions/download-artifact@v4
-        with:
-          path: release-assets
-          merge-multiple: true
-      - shell: bash
-        env:
-          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-          VERSION: ${{ needs.version-check.outputs.version }}
-        run: |
-          NOTES="Rolling Feynman bundles from main for the curl/PowerShell installer."
-          if gh release view edge >/dev/null 2>&1; then
-            gh release view edge --json assets --jq '.assets[].name' | while IFS= read -r asset; do
-              [ -n "$asset" ] || continue
-              gh release delete-asset edge "$asset" --yes
-            done
-            gh release upload edge release-assets/*
-            gh release edit edge \
-              --title "edge" \
-              --notes "$NOTES" \
-              --prerelease \
-              --draft=false \
-              --target "$GITHUB_SHA"
-          else
-            gh release create edge release-assets/* \
-              --title "edge" \
-              --notes "$NOTES" \
-              --prerelease \
-              --latest=false \
-              --target "$GITHUB_SHA"
-          fi
   release-github:
     needs:
       - version-check
-      - publish-npm
       - build-native-bundles
-    if: needs.version-check.outputs.should_publish == 'true' && needs.build-native-bundles.result == 'success' && needs.publish-npm.result == 'success'
+    if: needs.version-check.outputs.should_release == 'true' && needs.build-native-bundles.result == 'success'
     runs-on: blacksmith-4vcpu-ubuntu-2404
     permissions:
       contents: write
@@ -153,6 +95,7 @@ jobs:
           merge-multiple: true
       - shell: bash
         env:
+          GH_REPO: ${{ github.repository }}
           GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
           VERSION: ${{ needs.version-check.outputs.version }}
         run: |

.nvmrc (new file, 1 line)

@@ -0,0 +1 @@
20.19.0


@@ -14,3 +14,156 @@ Use this file to track chronology, not release notes. Keep entries short, factua
 - Failed / learned: ...
 - Blockers: ...
 - Next: ...
### 2026-03-25 00:00 local — scaling-laws
- Objective: Set up a deep research workflow for scaling laws.
- Changed: Created plan artifact at `outputs/.plans/scaling-laws.md`; defined 4 disjoint researcher dimensions and acceptance criteria.
- Verified: Read `CHANGELOG.md` and checked prior memory for related plan `scaling-laws-implications`.
- Failed / learned: No prior run-specific changelog entries existed beyond the template.
- Blockers: Waiting for user confirmation before launching researcher round 1.
- Next: On confirmation, spawn 4 parallel researcher subagents and begin evidence collection.
### 2026-03-25 00:30 local — scaling-laws (T4 inference/time-scale pass)
- Objective: Complete T4 on inference/test-time scaling and reasoning-time compute, scoped to 2023-2026.
- Changed: Wrote `notes/scaling-laws-research-inference.md`; updated `outputs/.plans/scaling-laws.md` to mark T4 done and log the inference-scaling verification pass.
- Verified: Cross-read 13 primary/official sources covering Tree-of-Thoughts, PRMs, repeated sampling, compute-optimal test-time scaling, provable laws, o1, DeepSeek-R1, s1, verifier failures, Anthropic extended thinking, and OpenAI reasoning API docs.
- Failed / learned: OpenAI blog fetch for `learning-to-reason-with-llms` returned malformed content, so the note leans on the o1 system card and API docs instead of that blog post.
- Blockers: T2 and T5 remain open before final synthesis; no single unified law for inference-time scaling emerged from public sources.
- Next: Complete T5 implications synthesis, then reconcile T3/T4 with foundational T2 before drafting the cited brief.
### 2026-03-25 11:20 local — scaling-laws (T6 draft synthesis)
- Objective: Synthesize the four research notes into a single user-facing draft brief for the scaling-laws workflow.
- Changed: Wrote `outputs/.drafts/scaling-laws-draft.md` with an executive summary, curated reading list, qualitative meta-analysis, core-paper comparison table, explicit training-vs-inference distinction, and numbered inline citations with direct-URL sources.
- Verified: Cross-checked the draft against `notes/scaling-laws-research-foundations.md`, `notes/scaling-laws-research-revisions.md`, `notes/scaling-laws-research-inference.md`, and `notes/scaling-laws-research-implications.md` to ensure the brief explicitly states the literature is too heterogeneous for a pooled effect-size estimate.
- Failed / learned: The requested temp-run `context.md` and `plan.md` were absent, so the synthesis used `outputs/.plans/scaling-laws.md` plus the four note files as the working context.
- Blockers: Citation/claim verification pass still pending; this draft should be treated as pre-verification.
- Next: Run verifier/reviewer passes, then promote the draft into the final cited brief and provenance sidecar.
### 2026-03-25 11:28 local — scaling-laws (final brief + pdf)
- Objective: Deliver a paper guide and qualitative meta-analysis on AI scaling laws.
- Changed: Finalized `outputs/scaling-laws.md` and sidecar `outputs/scaling-laws.provenance.md`; rendered preview PDF at `outputs/scaling-laws.pdf`; updated plan ledger and verification log in `outputs/.plans/scaling-laws.md`.
- Verified: Ran a reviewer pass recorded in `notes/scaling-laws-verification.md`; spot-checked key primary papers via alpha-backed reads for Kaplan 2020, Chinchilla 2022, and Snell 2024; confirmed PDF render output exists.
- Failed / learned: A pooled statistical meta-analysis would be misleading because the literature mixes heterogeneous outcomes, scaling axes, and evaluation regimes; final deliverable uses a qualitative meta-analysis instead.
- Blockers: None for this brief.
- Next: If needed, extend into a narrower sub-survey (e.g. only pretraining laws, only inference-time scaling, or only post-Chinchilla data-quality revisions).
### 2026-03-25 14:52 local — skills-only-install
- Objective: Let users download the Feynman research skills without installing the full terminal runtime.
- Changed: Added standalone skills-only installers at `scripts/install/install-skills.sh` and `scripts/install/install-skills.ps1`; synced website-public copies; documented user-level and repo-local install flows in `README.md`, `website/src/content/docs/getting-started/installation.md`, and `website/src/pages/index.astro`.
- Verified: Ran `sh -n scripts/install/install-skills.sh`; ran `node scripts/sync-website-installers.mjs`; ran `cd website && npm run build`; executed `sh scripts/install/install-skills.sh --dir <tmp>` and confirmed extracted `SKILL.md` files land in the target directory.
- Failed / learned: PowerShell installer behavior was not executed locally because PowerShell is not installed in this environment.
- Blockers: None for the Unix installer flow; Windows remains syntax-only by inspection.
- Next: If users want this exposed more prominently, add a dedicated docs/reference page and a homepage-specific skills-only CTA instead of a text link.
### 2026-03-26 18:08 PDT — installer-release-unification
- Objective: Remove the moving `edge` installer channel and unify installs on tagged releases only.
- Changed: Updated `scripts/install/install.sh`, `scripts/install/install.ps1`, `scripts/install/install-skills.sh`, and `scripts/install/install-skills.ps1` so the default target is the latest tagged release, latest-version resolution uses public GitHub release pages instead of `api.github.com`, and explicit `edge` requests now fail with a removal message; removed the `release-edge` job from `.github/workflows/publish.yml`; updated `README.md` and `website/src/content/docs/getting-started/installation.md`; re-synced `website/public/install*`.
- Verified: Ran `sh -n` on the Unix installer copies; confirmed `sh scripts/install/install.sh edge` and `sh scripts/install/install-skills.sh edge --dir <tmp>` fail with the intended removal message; executed `sh scripts/install/install.sh` into temp dirs and confirmed the installed binary reports `0.2.14`; executed `sh scripts/install/install-skills.sh --dir <tmp>` and confirmed extracted `SKILL.md` files; ran `cd website && npm run build`.
- Failed / learned: The install failure was caused by unauthenticated GitHub API rate limiting on the `edge` path, so renaming channels without removing the API dependency would not have fixed the root cause.
- Blockers: `npm run build` still emits a pre-existing duplicate-content warning for `getting-started/installation`; the build succeeds.
- Next: If desired, remove the now-unused `stable` alias too and clean up the duplicate docs-content warning separately.
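The latest-version resolution described above relies on GitHub redirecting `/releases/latest` to `/releases/tag/<tag>`, which sidesteps the unauthenticated api.github.com rate limits. The shipped installers do this in shell/PowerShell; the same idea in TypeScript, with an illustrative repo path:

```typescript
// Pure helper: extract the tag from a redirect's Location header.
function tagFromLocation(location: string | null): string | null {
  const match = location?.match(/\/releases\/tag\/([^/]+)$/);
  return match ? decodeURIComponent(match[1]) : null;
}

// GitHub answers /releases/latest with a redirect; with redirect: "manual"
// we read the Location header instead of following it, so no API call
// (and no API rate limit) is involved.
async function latestReleaseTag(repo: string): Promise<string | null> {
  const res = await fetch(`https://github.com/${repo}/releases/latest`, {
    redirect: "manual",
  });
  return tagFromLocation(res.headers.get("location"));
}
```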
### 2026-03-27 11:58 PDT — release-0.2.15
- Objective: Make the non-Anthropic subagent/auth fixes and contributor-guide updates releasable to tagged-install users instead of leaving them only on `main`.
- Changed: Bumped the package version from `0.2.14` to `0.2.15` in `package.json` and `package-lock.json`; updated pinned installer examples in `README.md` and `website/src/content/docs/getting-started/installation.md`; aligned the local-development docs example to the npm-based root workflow; added `CONTRIBUTING.md` plus the bundled `skills/contributing/SKILL.md`.
- Verified: Confirmed the publish workflow keys off `package.json` versus the currently published npm version; confirmed local `npm test`, `npm run typecheck`, and `npm run build` pass before the release bump.
- Failed / learned: The open subagent issue is fixed on `main` but still user-visible on tagged installs until a fresh release is cut.
- Blockers: Need the GitHub publish workflow to finish successfully before the issue can be honestly closed as released.
- Next: Push `0.2.15`, monitor the publish workflow, then update and close the relevant GitHub issue/PR once the release is live.
### 2026-03-28 15:15 PDT — pi-subagents-agent-dir-compat
- Objective: Debug why tagged installs can still fail subagent/auth flows after `0.2.15` when users are not on Anthropic.
- Changed: Added `scripts/lib/pi-subagents-patch.mjs` plus type declarations and wired `scripts/patch-embedded-pi.mjs` to rewrite vendored `pi-subagents` runtime files so they resolve user-scoped paths from `PI_CODING_AGENT_DIR` instead of hardcoded `~/.pi/agent`; added `tests/pi-subagents-patch.test.ts`.
- Verified: Materialized `.feynman/npm`, inspected the shipped `pi-subagents@0.11.11` sources, confirmed the hardcoded `~/.pi/agent` paths in `index.ts`, `agents.ts`, `artifacts.ts`, `run-history.ts`, `skills.ts`, and `chain-clarify.ts`; ran `node scripts/patch-embedded-pi.mjs`; ran `npm test`, `npm run typecheck`, and `npm run build`.
- Failed / learned: The earlier `0.2.15` fix only proved that Feynman exported `PI_CODING_AGENT_DIR` to the top-level Pi child; it did not cover vendored extension code that still hardcoded `.pi` paths internally.
- Blockers: Users still need a release containing this patch before tagged installs benefit from it.
- Next: Cut the next release and verify a tagged install exercises subagents without reading from `~/.pi/agent`.
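The resolution rule the patch enforces on the vendored files can be sketched as a single helper: prefer the user-scoped `PI_CODING_AGENT_DIR` override and only fall back to the hardcoded default. A minimal sketch, not the actual patched code:

```typescript
import os from "node:os";
import path from "node:path";

// Prefer the env override that Feynman exports to the Pi child process;
// fall back to the upstream default of ~/.pi/agent otherwise.
function resolveAgentDir(env: NodeJS.ProcessEnv = process.env): string {
  const override = env.PI_CODING_AGENT_DIR;
  if (override && override.trim() !== "") return override;
  return path.join(os.homedir(), ".pi", "agent");
}
```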
### 2026-03-28 21:46 PDT — release-0.2.16
- Objective: Ship the vendored `pi-subagents` agent-dir compatibility fix to tagged installs.
- Changed: Bumped the package version from `0.2.15` to `0.2.16` in `package.json` and `package-lock.json`; updated pinned installer examples in `README.md` and `website/src/content/docs/getting-started/installation.md`.
- Verified: Re-ran `npm test`, `npm run typecheck`, and `npm run build`; ran `cd website && npm run build`; ran `npm pack` and confirmed the `0.2.16` tarball includes the new `scripts/lib/pi-subagents-patch.*` files.
- Failed / learned: An initial local `build:native-bundle` check failed because `npm pack` and `build:native-bundle` were run in parallel, and `prepack` intentionally removes `dist/release`; rerunning `npm run build:native-bundle` sequentially succeeded.
- Blockers: None in the repo; publishing still depends on the GitHub workflow running on the bumped version.
- Next: Push the `0.2.16` release bump and monitor npm/GitHub release publication.
### 2026-03-31 10:45 PDT — pi-maintenance-issues-prs
- Objective: Triage open Pi-related issues/PRs, fix the concrete package update regression, and refresh Pi dependencies against current upstream releases.
- Changed: Pinned direct package-manager operations (`feynman update`, `feynman packages install`) to Feynman's npm prefix by exporting `FEYNMAN_NPM_PREFIX`, `NPM_CONFIG_PREFIX`, and `npm_config_prefix` before invoking Pi's `DefaultPackageManager`; bumped `@mariozechner/pi-ai` and `@mariozechner/pi-coding-agent` from `0.62.0` to `0.64.0`; adapted `src/model/registry.ts` to the new `ModelRegistry.create(...)` factory; integrated PR #15's `/feynman-model` command on top of current `main`.
- Verified: Ran `npm test`, `npm run typecheck`, and `npm run build` successfully after the dependency bump and PR integration; confirmed upstream `pi-coding-agent@0.64.0` still uses `npm install -g` for user-scope package updates, so the Feynman-side prefix fix is still required.
- Failed / learned: PR #14 is a stale branch with no clean merge path against current `main`; the only user-facing delta is the ValiChord prompt/skill addition, and the branch also carries unrelated release churn plus demo-style material, so it was not merged in this pass.
- Blockers: None in the local repo state; remote merge/push still depends on repository credentials and branch policy.
- Next: If remote write access is available, commit and push the validated maintenance changes, then close issue #22 and resolve PR #15 as merged while leaving PR #14 unmerged pending a cleaned-up, non-promotional resubmission.
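The prefix-pinning fix above amounts to exporting the three prefix variables before invoking Pi's package manager, so `npm install -g` style operations land inside Feynman's own npm tree. A sketch under that assumption (the prefix path is illustrative):

```typescript
// Build the child-process environment with npm's prefix pinned. Setting all
// three spellings covers Feynman's own check, npm's documented env form,
// and the lowercase form npm derives from config.
function npmEnvWithPrefix(prefix: string): NodeJS.ProcessEnv {
  return {
    ...process.env,
    FEYNMAN_NPM_PREFIX: prefix,
    NPM_CONFIG_PREFIX: prefix,
    npm_config_prefix: prefix,
  };
}

// Usage sketch:
// spawnSync("npm", ["install", "-g", "some-package"], { env: npmEnvWithPrefix("/opt/feynman/npm") });
```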
### 2026-03-31 12:05 PDT — pi-backlog-cleanup-round-2
- Objective: Finish the remaining high-confidence open tracker items after the Pi 0.64.0 upgrade instead of leaving the issue list half-reconciled.
- Changed: Added a Windows extension-loader patch helper so Feynman rewrites Pi extension imports to `file://` URLs on Windows before interactive startup; added `/commands`, `/tools`, and `/capabilities` discovery commands and surfaced `/hotkeys` plus `/service-tier` in help metadata; added explicit service-tier support via `feynman model tier`, `--service-tier`, status/doctor output, and a provider-payload hook that passes `service_tier` only to supported OpenAI/OpenAI Codex/Anthropic models; added Exa provider recognition to Feynman's web-search status layer and vendored `pi-web-access`.
- Verified: Ran `npm test`, `npm run typecheck`, and `npm run build`; smoke-imported the modified vendored `pi-web-access` modules with `node --import tsx`.
- Failed / learned: The remaining ValiChord PR is still stale and mixes a real prompt/skill update with unrelated branch churn; it is a review/triage item, not a clean merge candidate.
- Blockers: No local build blockers remain; issue/PR closure still depends on the final push landing on `main`.
- Next: Push the verified cleanup commit, then close issues fixed by the dependency bump plus the new discoverability/service-tier/Windows patches, and close the stale ValiChord PR explicitly instead of leaving it open indefinitely.
### 2026-04-09 09:37 PDT — windows-startup-import-specifiers
- Objective: Fix Windows startup failures where `feynman` exits before the Pi child process initializes.
- Changed: Converted the Node preload module paths passed via `node --import` in `src/pi/launch.ts` to `file://` specifiers using a new `toNodeImportSpecifier(...)` helper in `src/pi/runtime.ts`; expanded `scripts/patch-embedded-pi.mjs` so it also patches the bundled workspace copy of Pi's extension loader when present.
- Verified: Added a regression test in `tests/pi-runtime.test.ts` covering absolute-path to `file://` conversion for preload imports; ran `npm test`, `npm run typecheck`, and `npm run build`.
- Failed / learned: The raw Windows `ERR_UNSUPPORTED_ESM_URL_SCHEME` stack is more consistent with Node rejecting the child-process `--import C:\\...` preload before Pi starts than with a normal in-app extension load failure.
- Blockers: Windows runtime execution was not available locally, so the fix is verified by code path inspection and automated tests rather than an actual Windows shell run.
- Next: Ask the affected user to reinstall or update to the next published package once released, and confirm the Windows REPL now starts from a normal PowerShell session.
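The path-to-URL conversion this entry describes can be sketched with Node's built-in `pathToFileURL`. The helper name matches the entry; the real implementation lives in `src/pi/runtime.ts` and may differ.

```typescript
import { pathToFileURL } from "node:url";

// `node --import` treats its argument as a URL, so a raw Windows path like
// C:\app\preload.mjs fails with ERR_UNSUPPORTED_ESM_URL_SCHEME ("C:" parses
// as a URL scheme). pathToFileURL handles drive letters and percent-encoding.
function toNodeImportSpecifier(modulePath: string): string {
  return pathToFileURL(modulePath).href;
}

console.log(toNodeImportSpecifier(process.cwd()));
```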
### 2026-04-09 11:02 PDT — tracker-hardening-pass
- Objective: Triage the open repo backlog, land the highest-signal fixes locally, and add guardrails against stale promotional workflow content.
- Changed: Hardened Windows launch paths in `bin/feynman.js`, `scripts/build-native-bundle.mjs`, and `scripts/install/install.ps1`; set npm prefix overrides earlier in `scripts/patch-embedded-pi.mjs`; added a `pi-web-access` runtime patch helper plus `FEYNMAN_WEB_SEARCH_CONFIG` env wiring so bundled web search reads the same `~/.feynman/web-search.json` that doctor/status report; taught `src/pi/web-access.ts` to honor the legacy `route` key; fixed bundled skill references and expanded the skills-only installers/docs to ship the prompt and guidance files those skills reference; added regression tests for config paths, catalog snapshot edges, skill-path packaging, `pi-web-access` patching, and blocked promotional content.
- Verified: Ran `npm test`, `npm run typecheck`, and `npm run build` successfully after the full maintenance pass.
- Failed / learned: The skills-only install issue was not just docs drift; the shipped `SKILL.md` files referenced prompt paths that only made sense after installation, so the repo needed both path normalization and packaging changes.
- Blockers: Remote issue/PR closure and merge actions still depend on the final reviewed branch state being pushed.
- Next: Push the validated fixes, close the duplicate Windows/reporting issues they supersede, reject the promotional ValiChord PR explicitly, and then review whether the remaining docs-only or feature PRs should be merged separately.
### 2026-04-09 10:28 PDT — verification-and-security-pass
- Objective: Run a deeper install/security verification pass against the post-cleanup `0.2.17` tree instead of assuming the earlier targeted fixes covered the shipped artifacts.
- Changed: Reworked `extensions/research-tools/header.ts` to use `@mariozechner/pi-tui` width-aware helpers for truncation/wrapping so wide Unicode text does not overflow custom header rows; changed `src/pi/launch.ts` to stop mirroring child crash signals back onto the parent process and instead emit a conventional exit code; added `FEYNMAN_INSTALL_SKILLS_ARCHIVE_URL` overrides to the skills installers for pre-release smoke testing; aligned root and website dependency trees with patched transitive versions using npm `overrides`; fixed `src/pi/web-access.ts` so `search status` respects `FEYNMAN_HOME` semantics instead of hardcoding the current shell home directory; added `tests/pi-launch.test.ts`.
- Verified: Ran `npm test`, `npm run typecheck`, `npm run build`, `cd website && npm run build`, `npm run build:native-bundle`; smoke-tested `scripts/install/install.sh` against a locally served `dist/release/feynman-0.2.17-darwin-arm64.tar.gz`; smoke-tested `scripts/install/install-skills.sh` against a local source archive; confirmed installed `feynman --version`, `feynman --help`, `feynman doctor`, and packaged `feynman search status` work from the installed bundle; `npm audit --omit=dev` is clean in the root app and website after overrides.
- Failed / learned: The first packaged `search status` smoke test still showed the user home path because the native bundle had been built before the `FEYNMAN_HOME` path fix; rebuilding the native bundle resolved that mismatch.
- Blockers: PowerShell runtime was unavailable locally, so Windows installer execution remained code-path validated rather than actually executed.
- Next: Push the second-pass hardening commit, then keep issue `#46` and issue `#47` open until users on the affected Linux/CJK environments confirm whether the launcher/header fixes fully resolve them.
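The "conventional exit code" change above can be sketched as the common POSIX `128 + signal number` shell convention. That mapping is an assumption here; the entry does not spell out the exact code Feynman emits.

```typescript
import { constants } from "node:os";

// Map a child-process exit to a single numeric code instead of re-raising the
// crash signal on the parent: pass through a real exit code, otherwise use the
// 128 + signal-number shell convention (e.g. SIGTERM = 15 -> 143).
function exitCodeFor(code: number | null, signal: keyof typeof constants.signals | null): number {
  if (code !== null) return code;
  return signal ? 128 + constants.signals[signal] : 1;
}

console.log(exitCodeFor(null, "SIGTERM")); // 143 (128 + 15)
```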
### 2026-04-09 10:36 PDT — remaining-tracker-triage-pass
- Objective: Reduce the remaining open tracker items by landing the lowest-risk missing docs/catalog updates and a targeted Cloud Code Assist compatibility patch instead of only hand-triaging them.
- Changed: Added MiniMax M2.7 recommendation preferences in `src/model/catalog.ts`; documented model switching, authenticated-provider visibility, and `/feynman-model` subagent overrides in `website/src/content/docs/getting-started/configuration.md` and `website/src/content/docs/reference/slash-commands.md`; added a runtime patch helper in `scripts/lib/pi-google-legacy-schema-patch.mjs` and wired `scripts/patch-embedded-pi.mjs` to normalize JSON Schema `const` into `enum` for the legacy `parameters` field used by Cloud Code Assist Claude models.
- Verified: Ran `npm test`, `npm run typecheck`, `npm run build`, and `cd website && npm run build` after the patch/helper/docs changes.
- Failed / learned: The MiniMax provider catalog in Pi already uses canonical IDs like `MiniMax-M2.7`, so the only failure during validation was a test assertion using the wrong casing rather than a runtime bug.
- Blockers: The Cloud Code Assist fix is validated by targeted patch tests and code-path review rather than an end-to-end Google account repro in this environment.
- Next: Push the tracker-triage commit, close the docs/MiniMax PRs as superseded by main, close the support-style model issues against the new docs, and decide whether the remaining feature requests should be left open or closed as not planned/upstream-dependent.
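The `const`-to-`enum` normalization described above can be sketched as a recursive schema rewrite. The function name is illustrative; the actual patch lives in `scripts/lib/pi-google-legacy-schema-patch.mjs` and targets only the legacy `parameters` field.

```typescript
// JSON Schema `const: x` is equivalent to `enum: [x]`; backends that predate
// the `const` keyword accept only the enum form, so rewrite it recursively.
function normalizeConstToEnum(schema: unknown): unknown {
  if (Array.isArray(schema)) return schema.map(normalizeConstToEnum);
  if (!schema || typeof schema !== "object") return schema;
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(schema as Record<string, unknown>)) {
    if (key === "const") {
      out.enum = [value];
      continue;
    }
    out[key] = normalizeConstToEnum(value);
  }
  return out;
}

console.log(JSON.stringify(normalizeConstToEnum({ type: "string", const: "semantic" })));
// {"type":"string","enum":["semantic"]}
```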
### 2026-04-10 10:22 PDT — web-access-stale-override-fix
- Objective: Fix the new `ctx.modelRegistry.getApiKeyAndHeaders is not a function` / stale `search-filter.js` report without reintroducing broad vendor drift.
- Changed: Removed the stale `.feynman/vendor-overrides/pi-web-access/*` files and removed `syncVendorOverride` from `scripts/patch-embedded-pi.mjs`; kept the targeted `pi-web-access` runtime config-path patch; added `feynman search set <provider> [api-key]` and `feynman search clear` commands with a shared save path in `src/pi/web-access.ts`.
- Verified: Ran `npm test`, `npm run typecheck`, `npm run build`; ran `node scripts/patch-embedded-pi.mjs`, confirmed the installed `pi-web-access/index.ts` has no `search-filter` / condense helper references, and smoke-imported `./.feynman/npm/node_modules/pi-web-access/index.ts`; ran `npm pack --dry-run` and confirmed stale `vendor-overrides` files are no longer in the package tarball.
- Failed / learned: The public Linux installer Docker test was attempted but Docker Desktop became unresponsive even for simple `docker run node:22-bookworm node -v` commands; the earlier Linux npm-artifact container smoke remains valid, but this specific public-installer run is blocked by the local Docker daemon.
- Blockers: Issue `#54` is too underspecified to fix directly without logs; public Linux installer behavior still needs a stable Docker daemon or a real Linux shell to reproduce the user's exact npm errors.
- Next: Push the stale-override fix, close PR `#52` and PR `#53` as superseded/merged-by-main once pushed, and ask for logs on issue `#54` instead of guessing.
### 2026-04-10 10:49 PDT — rpc-and-website-verification-pass
- Objective: Exercise the Feynman wrapper's RPC mode and the website quality gates that were not fully covered by the prior passes.
- Changed: Added `--mode <text|json|rpc>` pass-through support in the Feynman wrapper and skipped terminal clearing in RPC mode; added `@astrojs/check` to the website dev dependencies, fixed React Refresh lint violations in the generated UI components by exporting only components, and added safe website dependency overrides for dev-audit findings.
- Verified: Ran a JSONL RPC smoke test through `node bin/feynman.js --mode rpc` with `get_state`; ran `npm test`, `npm run typecheck`, `npm run build`, `cd website && npm run lint`, `cd website && npm run typecheck`, `cd website && npm run build`, full root `npm audit`, full website `npm audit`, and `npm run build:native-bundle`.
- Failed / learned: Website typecheck was previously a no-op prompt because `@astrojs/check` was missing; installing it exposed dev-audit findings that needed explicit overrides before the full website audit was clean.
- Blockers: Docker Desktop remained unreliable after restart attempts, so this pass still does not include a second successful public-installer Linux Docker run.
- Next: Push the RPC/website verification commit and keep future Docker/public-installer validation separate from repo correctness unless Docker is stable.

CONTRIBUTING.md (new file)

@@ -0,0 +1,115 @@
# Contributing to Feynman
Feynman is a research-first CLI built on Pi and alphaXiv. This guide is for humans and agents contributing code, prompts, skills, docs, installers, or workflow behavior to the repository.
## Quick Links
- GitHub: https://github.com/getcompanion-ai/feynman
- Docs: https://feynman.is/docs
- Repo agent contract: [AGENTS.md](AGENTS.md)
- Issues: https://github.com/getcompanion-ai/feynman/issues
## What Goes Where
- CLI/runtime code: `src/`
- Bundled prompt templates: `prompts/`
- Bundled Pi skills: `skills/`
- Bundled Pi subagent prompts: `.feynman/agents/`
- Docs site: `website/`
- Build/release scripts: `scripts/`
- Generated research artifacts: `outputs/`, `papers/`, `notes/`
If you need to change how bundled subagents behave, edit `.feynman/agents/*.md`. Do not duplicate that behavior in `AGENTS.md`.
## Before You Open a PR
1. Start from the latest `main`.
2. Use Node.js `20.19.0` or newer. The repo expects `.nvmrc`, `package.json` engines, `website/package.json` engines, and the runtime version guard to stay aligned.
3. Install dependencies from the repo root:
```bash
nvm use || nvm install
npm install
```
4. Run the required checks before asking for review:
```bash
npm test
npm run typecheck
npm run build
```
5. If you changed the docs site, also validate the website:
```bash
cd website
npm install
npm run build
```
6. Keep the PR focused. Do not mix unrelated cleanup with the real change.
7. Add or update tests when behavior changes.
8. Update docs, prompts, or skills when the user-facing workflow changes.
## Contribution Rules
- Bugs, docs fixes, installer fixes, and focused workflow improvements are good PRs.
- Large feature changes should start with an issue or a concrete implementation discussion before code lands.
- Avoid refactor-only PRs unless they are necessary to unblock a real fix or requested by a maintainer.
- Do not silently change release behavior, installer behavior, or runtime defaults without documenting the reason in the PR.
- Use American English in docs, comments, prompts, UI copy, and examples.
- Do not add bundled prompts, skills, or docs whose primary purpose is to market, endorse, or funnel users toward a third-party product or service. Product integrations must be justified by user-facing utility and written in neutral language.
## Repo-Specific Checks
### Prompt and skill changes
- New workflows usually live in `prompts/*.md`.
- New reusable capabilities usually live in `skills/<name>/SKILL.md`.
- Keep skill files concise. Put detailed operational rules in the prompt or in focused reference files only when needed.
- If a new workflow should be invokable from the CLI, make sure its prompt frontmatter includes the correct metadata and that the command works through the normal prompt discovery path.
### Agent and artifact conventions
- `AGENTS.md` is the repo-level contract for workspace conventions, handoffs, provenance, and output naming.
- Long-running research flows should write plan artifacts to `outputs/.plans/` and use `CHANGELOG.md` as a lab notebook when the work is substantial.
- Do not update `CHANGELOG.md` for trivial one-shot changes.
### Release and versioning discipline
- The curl installer and release docs point users at tagged releases, not arbitrary commits on `main`.
- If you ship user-visible fixes after a tag, do not leave the repo in a state where `main` and the latest release advertise the same version string while containing different behavior.
- When changing release-sensitive behavior, check the version story across:
- `.nvmrc`
- `package.json`
- `website/package.json`
- `scripts/check-node-version.mjs`
- install docs in `README.md` and `website/src/content/docs/getting-started/installation.md`
## AI-Assisted Contributions
AI-assisted PRs are fine. The contributor is still responsible for the diff.
- Understand the code you are submitting.
- Run the local checks yourself instead of assuming generated code is correct.
- Include enough context in the PR description for a reviewer to understand the change quickly.
- If an agent updated prompts or skills, verify the instructions match the actual repo behavior.
## Review Expectations
- Explain what changed and why.
- Call out tradeoffs, follow-up work, and anything intentionally not handled.
- Include screenshots for UI changes.
- Resolve review comments you addressed before requesting review again.
## Good First Areas
Useful contributions usually land in one of these areas:
- installation and upgrade reliability
- research workflow quality
- model/provider setup ergonomics
- docs clarity
- preview and export stability
- packaging and release hygiene


@@ -13,19 +13,59 @@
### Installation
**macOS / Linux:**
```bash
curl -fsSL https://feynman.is/install | bash
# stable release channel
curl -fsSL https://feynman.is/install | bash -s -- stable
# package manager fallback
pnpm add -g @companion-ai/feynman
bun add -g @companion-ai/feynman
```
**Windows (PowerShell):**
```powershell
irm https://feynman.is/install.ps1 | iex
```
The one-line installer fetches the latest tagged release. To pin a version, pass it explicitly, for example `curl -fsSL https://feynman.is/install | bash -s -- 0.2.17`.
The installer downloads a standalone native bundle with its own Node.js runtime.
Local models are supported through the custom-provider flow. For Ollama, run `feynman setup`, choose `Custom provider (baseUrl + API key)`, use `openai-completions`, and point it at `http://localhost:11434/v1`.
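As a sketch of the values involved, the resulting custom-provider entry would carry something like the following. The field names here are illustrative, not Feynman's actual config schema; Ollama typically ignores the API key, so a placeholder value works.

```json
{
  "provider": "custom",
  "api": "openai-completions",
  "baseUrl": "http://localhost:11434/v1",
  "apiKey": "ollama"
}
```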
### Skills Only
If you want just the research skills without the full terminal app:
**macOS / Linux:**
```bash
curl -fsSL https://feynman.is/install-skills | bash
```
**Windows (PowerShell):**
```powershell
irm https://feynman.is/install-skills.ps1 | iex
```
That installs the skill library into `~/.codex/skills/feynman`.
For a repo-local install instead:
**macOS / Linux:**
```bash
curl -fsSL https://feynman.is/install-skills | bash -s -- --repo
```
**Windows (PowerShell):**
```powershell
& ([scriptblock]::Create((irm https://feynman.is/install-skills.ps1))) -Scope Repo
```
That installs into `.agents/skills/feynman` under the current repository.
These installers download the bundled `skills/` and `prompts/` trees plus the repo guidance files referenced by those skills. They do not install the Feynman terminal, bundled Node runtime, auth storage, or Pi packages.
---
@@ -45,7 +85,7 @@ $ feynman audit 2401.12345
→ Compares paper claims against the public codebase
$ feynman replicate "chain-of-thought improves math"
Replicates experiments on local or cloud GPUs
```
---
@@ -60,7 +100,7 @@ Ask naturally or use slash commands as shortcuts.
| `/lit <topic>` | Literature review from paper search and primary sources |
| `/review <artifact>` | Simulated peer review with severity and revision plan |
| `/audit <item>` | Paper vs. codebase mismatch audit |
| `/replicate <paper>` | Replicate experiments on local or cloud GPUs |
| `/compare <topic>` | Source comparison matrix |
| `/draft <topic>` | Paper-style draft from research findings |
| `/autoresearch <idea>` | Autonomous experiment loop |
@@ -100,9 +140,16 @@ Built on [Pi](https://github.com/badlogic/pi-mono) for the agent runtime, [alpha
### Contributing
See [CONTRIBUTING.md](CONTRIBUTING.md) for the full contributor guide.
```bash
git clone https://github.com/getcompanion-ai/feynman.git
cd feynman
nvm use || nvm install
npm install
npm test
npm run typecheck
npm run build
```
[Docs](https://feynman.is/docs) · [MIT License](LICENSE)


@@ -1,9 +1,36 @@
#!/usr/bin/env node
import { resolve } from "node:path";
import { pathToFileURL } from "node:url";
const MIN_NODE_VERSION = "20.19.0";
function parseNodeVersion(version) {
const [major = "0", minor = "0", patch = "0"] = version.replace(/^v/, "").split(".");
return {
major: Number.parseInt(major, 10) || 0,
minor: Number.parseInt(minor, 10) || 0,
patch: Number.parseInt(patch, 10) || 0,
};
}
function compareNodeVersions(left, right) {
if (left.major !== right.major) return left.major - right.major;
if (left.minor !== right.minor) return left.minor - right.minor;
return left.patch - right.patch;
}
if (compareNodeVersions(parseNodeVersion(process.versions.node), parseNodeVersion(MIN_NODE_VERSION)) < 0) {
const isWindows = process.platform === "win32";
console.error(`feynman requires Node.js ${MIN_NODE_VERSION} or later (detected ${process.versions.node}).`);
console.error(isWindows
? "Install a newer Node.js from https://nodejs.org, or use the standalone installer:"
: "Switch to Node 20 with `nvm install 20 && nvm use 20`, or use the standalone installer:");
console.error(isWindows
? "irm https://feynman.is/install.ps1 | iex"
: "curl -fsSL https://feynman.is/install | bash");
process.exit(1);
}
const here = import.meta.dirname;
await import(pathToFileURL(resolve(here, "..", "scripts", "patch-embedded-pi.mjs")).href);
await import(pathToFileURL(resolve(here, "..", "dist", "index.js")).href);


@@ -1,8 +1,12 @@
import type { ExtensionAPI } from "@mariozechner/pi-coding-agent";
import { registerAlphaTools } from "./research-tools/alpha.js";
import { registerDiscoveryCommands } from "./research-tools/discovery.js";
import { registerFeynmanModelCommand } from "./research-tools/feynman-model.js";
import { installFeynmanHeader } from "./research-tools/header.js";
import { registerHelpCommand } from "./research-tools/help.js";
import { registerInitCommand, registerOutputsCommand } from "./research-tools/project.js";
import { registerServiceTierControls } from "./research-tools/service-tier.js";
export default function researchTools(pi: ExtensionAPI): void {
const cache: { agentSummaryPromise?: Promise<{ agents: string[]; chains: string[] }> } = {};
@@ -15,7 +19,11 @@ export default function researchTools(pi: ExtensionAPI): void {
await installFeynmanHeader(pi, ctx, cache);
});
registerAlphaTools(pi);
registerDiscoveryCommands(pi);
registerFeynmanModelCommand(pi);
registerHelpCommand(pi);
registerInitCommand(pi);
registerOutputsCommand(pi);
registerServiceTierControls(pi);
}


@@ -0,0 +1,107 @@
import {
askPaper,
annotatePaper,
clearPaperAnnotation,
getPaper,
listPaperAnnotations,
readPaperCode,
searchPapers,
} from "@companion-ai/alpha-hub/lib";
import type { ExtensionAPI } from "@mariozechner/pi-coding-agent";
import { Type } from "@sinclair/typebox";
function formatText(value: unknown): string {
if (typeof value === "string") return value;
return JSON.stringify(value, null, 2);
}
export function registerAlphaTools(pi: ExtensionAPI): void {
pi.registerTool({
name: "alpha_search",
label: "Alpha Search",
description:
"Search research papers through alphaXiv. Modes: semantic (default, use 2-3 sentence queries), keyword (exact terms), agentic (broad multi-turn retrieval), both, or all.",
parameters: Type.Object({
query: Type.String({ description: "Search query." }),
mode: Type.Optional(
Type.String({ description: "Search mode: semantic, keyword, both, agentic, or all." }),
),
}),
async execute(_toolCallId, params) {
const result = await searchPapers(params.query, params.mode?.trim() || "semantic");
return { content: [{ type: "text", text: formatText(result) }], details: result };
},
});
pi.registerTool({
name: "alpha_get_paper",
label: "Alpha Get Paper",
description: "Fetch a paper's AI-generated report (or raw full text) plus any local annotation.",
parameters: Type.Object({
paper: Type.String({ description: "arXiv ID, arXiv URL, or alphaXiv URL." }),
fullText: Type.Optional(Type.Boolean({ description: "Return raw full text instead of AI report." })),
}),
async execute(_toolCallId, params) {
const result = await getPaper(params.paper, { fullText: params.fullText });
return { content: [{ type: "text", text: formatText(result) }], details: result };
},
});
pi.registerTool({
name: "alpha_ask_paper",
label: "Alpha Ask Paper",
description: "Ask a targeted question about a paper. Uses AI to analyze the PDF and answer.",
parameters: Type.Object({
paper: Type.String({ description: "arXiv ID, arXiv URL, or alphaXiv URL." }),
question: Type.String({ description: "Question about the paper." }),
}),
async execute(_toolCallId, params) {
const result = await askPaper(params.paper, params.question);
return { content: [{ type: "text", text: formatText(result) }], details: result };
},
});
pi.registerTool({
name: "alpha_annotate_paper",
label: "Alpha Annotate Paper",
description: "Write or clear a persistent local annotation for a paper.",
parameters: Type.Object({
paper: Type.String({ description: "Paper ID (arXiv ID or URL)." }),
note: Type.Optional(Type.String({ description: "Annotation text. Omit when clear=true." })),
clear: Type.Optional(Type.Boolean({ description: "Clear the existing annotation." })),
}),
async execute(_toolCallId, params) {
const result = params.clear
? await clearPaperAnnotation(params.paper)
: params.note
? await annotatePaper(params.paper, params.note)
: (() => { throw new Error("Provide either note or clear=true."); })();
return { content: [{ type: "text", text: formatText(result) }], details: result };
},
});
pi.registerTool({
name: "alpha_list_annotations",
label: "Alpha List Annotations",
description: "List all persistent local paper annotations.",
parameters: Type.Object({}),
async execute() {
const result = await listPaperAnnotations();
return { content: [{ type: "text", text: formatText(result) }], details: result };
},
});
pi.registerTool({
name: "alpha_read_code",
label: "Alpha Read Code",
description: "Read files from a paper's GitHub repository. Use '/' for repo overview.",
parameters: Type.Object({
githubUrl: Type.String({ description: "GitHub repository URL." }),
path: Type.Optional(Type.String({ description: "File or directory path. Default: '/'" })),
}),
async execute(_toolCallId, params) {
const result = await readPaperCode(params.githubUrl, params.path?.trim() || "/");
return { content: [{ type: "text", text: formatText(result) }], details: result };
},
});
}


@@ -0,0 +1,130 @@
import { existsSync, readFileSync } from "node:fs";
import { homedir } from "node:os";
import { resolve } from "node:path";
import type { ExtensionAPI, SlashCommandInfo, ToolInfo } from "@mariozechner/pi-coding-agent";
function resolveFeynmanSettingsPath(): string {
const configured = process.env.PI_CODING_AGENT_DIR?.trim();
const agentDir = configured
? configured.startsWith("~/")
? resolve(homedir(), configured.slice(2))
: resolve(configured)
: resolve(homedir(), ".feynman", "agent");
return resolve(agentDir, "settings.json");
}
function readConfiguredPackages(): string[] {
const settingsPath = resolveFeynmanSettingsPath();
if (!existsSync(settingsPath)) return [];
try {
const parsed = JSON.parse(readFileSync(settingsPath, "utf8")) as { packages?: unknown[] };
return Array.isArray(parsed.packages)
? parsed.packages
.map((entry) => {
if (typeof entry === "string") return entry;
if (!entry || typeof entry !== "object") return undefined;
const record = entry as { source?: unknown };
return typeof record.source === "string" ? record.source : undefined;
})
.filter((entry): entry is string => Boolean(entry))
: [];
} catch {
return [];
}
}
function formatSourceLabel(sourceInfo: { source: string; path: string }): string {
if (sourceInfo.source === "local") {
if (sourceInfo.path.includes("/prompts/")) return "workflow";
if (sourceInfo.path.includes("/extensions/")) return "extension";
return "local";
}
return sourceInfo.source.replace(/^npm:/, "").replace(/^git:/, "");
}
function formatCommandLine(command: SlashCommandInfo): string {
const source = formatSourceLabel(command.sourceInfo);
// Join with " — " so the selection handlers below can split the name back out.
const description = command.description ? ` — ${command.description}` : "";
return `/${command.name}${description} [${source}]`;
}
function summarizeToolParameters(tool: ToolInfo): string {
const properties =
tool.parameters &&
typeof tool.parameters === "object" &&
"properties" in tool.parameters &&
tool.parameters.properties &&
typeof tool.parameters.properties === "object"
? Object.keys(tool.parameters.properties as Record<string, unknown>)
: [];
return properties.length > 0 ? properties.join(", ") : "no parameters";
}
function formatToolLine(tool: ToolInfo): string {
const source = formatSourceLabel(tool.sourceInfo);
// Join with " — " so the selection handler below can split the name back out.
const description = tool.description ? ` — ${tool.description}` : "";
return `${tool.name}${description} [${source}]`;
}
export function registerDiscoveryCommands(pi: ExtensionAPI): void {
pi.registerCommand("commands", {
description: "Browse all available slash commands, including package and built-in commands.",
handler: async (_args, ctx) => {
const commands = pi
.getCommands()
.slice()
.sort((left, right) => left.name.localeCompare(right.name));
const items = commands.map((command) => formatCommandLine(command));
const selected = await ctx.ui.select("Slash Commands", items);
if (!selected) return;
ctx.ui.setEditorText(selected.split(" — ")[0] ?? "");
ctx.ui.notify(`Prefilled ${selected.split(" — ")[0]}`, "info");
},
});
pi.registerCommand("tools", {
description: "Browse all callable tools with their source and parameter summary.",
handler: async (_args, ctx) => {
const tools = pi
.getAllTools()
.slice()
.sort((left, right) => left.name.localeCompare(right.name));
const selected = await ctx.ui.select("Tools", tools.map((tool) => formatToolLine(tool)));
if (!selected) return;
const toolName = selected.split(" — ")[0] ?? selected;
const tool = tools.find((entry) => entry.name === toolName);
if (!tool) return;
ctx.ui.notify(`${tool.name}: ${summarizeToolParameters(tool)}`, "info");
},
});
pi.registerCommand("capabilities", {
description: "Show installed packages, discovery entrypoints, and high-level runtime capability counts.",
handler: async (_args, ctx) => {
const commands = pi.getCommands();
const tools = pi.getAllTools();
const workflows = commands.filter((command) => formatSourceLabel(command.sourceInfo) === "workflow");
const packages = readConfiguredPackages();
const items = [
`Commands: ${commands.length}`,
`Workflows: ${workflows.length}`,
`Tools: ${tools.length}`,
`Packages: ${packages.length}`,
"--- Discovery ---",
"/commands — browse slash commands",
"/tools — inspect callable tools",
"/hotkeys — view keyboard shortcuts",
"/service-tier — set request tier for supported providers",
"--- Installed Packages ---",
...packages.map((pkg) => pkg),
];
const selected = await ctx.ui.select("Capabilities", items);
if (!selected || selected.startsWith("---")) return;
if (selected.startsWith("/")) {
ctx.ui.setEditorText(selected.split(" — ")[0] ?? selected);
ctx.ui.notify(`Prefilled ${selected.split(" — ")[0]}`, "info");
}
},
});
}


@@ -0,0 +1,309 @@
import { type Dirent, existsSync, readdirSync, readFileSync, writeFileSync } from "node:fs";
import { homedir } from "node:os";
import { basename, join, resolve } from "node:path";
import type { ExtensionAPI } from "@mariozechner/pi-coding-agent";
const FRONTMATTER_PATTERN = /^---\n([\s\S]*?)\n---\n?([\s\S]*)$/;
const INHERIT_MAIN = "__inherit_main__";
type FrontmatterDocument = {
lines: string[];
body: string;
eol: string;
trailingNewline: boolean;
};
type SubagentModelConfig = {
agent: string;
model?: string;
filePath: string;
};
type SelectOption<T> = {
label: string;
value: T;
};
type CommandContext = Parameters<Parameters<ExtensionAPI["registerCommand"]>[1]["handler"]>[1];
type TargetChoice =
| { type: "main" }
| { type: "subagent"; agent: string; model?: string };
function expandHomePath(value: string): string {
if (value === "~") return homedir();
if (value.startsWith("~/")) return resolve(homedir(), value.slice(2));
return value;
}
function resolveFeynmanAgentDir(): string {
const configured = process.env.PI_CODING_AGENT_DIR ?? process.env.FEYNMAN_CODING_AGENT_DIR;
if (configured?.trim()) {
return resolve(expandHomePath(configured.trim()));
}
return resolve(homedir(), ".feynman", "agent");
}
function formatModelSpec(model: { provider: string; id: string }): string {
return `${model.provider}/${model.id}`;
}
function detectEol(text: string): string {
return text.includes("\r\n") ? "\r\n" : "\n";
}
function normalizeLineEndings(text: string): string {
return text.replace(/\r\n/g, "\n");
}
function parseFrontmatterDocument(text: string): FrontmatterDocument | null {
const normalized = normalizeLineEndings(text);
const match = normalized.match(FRONTMATTER_PATTERN);
if (!match) return null;
return {
lines: match[1].split("\n"),
body: match[2] ?? "",
eol: detectEol(text),
trailingNewline: normalized.endsWith("\n"),
};
}
function serializeFrontmatterDocument(document: FrontmatterDocument): string {
const normalized = `---\n${document.lines.join("\n")}\n---\n${document.body}`;
const withTrailingNewline =
document.trailingNewline && !normalized.endsWith("\n") ? `${normalized}\n` : normalized;
return document.eol === "\n" ? withTrailingNewline : withTrailingNewline.replace(/\n/g, "\r\n");
}
function parseFrontmatterKey(line: string): string | undefined {
	const match = line.match(/^\s*([A-Za-z0-9_-]+)\s*:/);
	return match?.[1]?.toLowerCase();
}
function getFrontmatterValue(lines: string[], key: string): string | undefined {
	const normalizedKey = key.toLowerCase();
	for (const line of lines) {
		const parsedKey = parseFrontmatterKey(line);
		if (parsedKey !== normalizedKey) continue;
		const separatorIndex = line.indexOf(":");
		if (separatorIndex === -1) return undefined;
		const value = line.slice(separatorIndex + 1).trim();
		return value.length > 0 ? value : undefined;
	}
	return undefined;
}
function upsertFrontmatterValue(lines: string[], key: string, value: string): string[] {
	const normalizedKey = key.toLowerCase();
	const nextLines = [...lines];
	const existingIndex = nextLines.findIndex((line) => parseFrontmatterKey(line) === normalizedKey);
	const serialized = `${key}: ${value}`;
	if (existingIndex !== -1) {
		nextLines[existingIndex] = serialized;
		return nextLines;
	}
	const descriptionIndex = nextLines.findIndex((line) => parseFrontmatterKey(line) === "description");
	const nameIndex = nextLines.findIndex((line) => parseFrontmatterKey(line) === "name");
	const insertIndex =
		descriptionIndex !== -1 ? descriptionIndex + 1 : nameIndex !== -1 ? nameIndex + 1 : nextLines.length;
	nextLines.splice(insertIndex, 0, serialized);
	return nextLines;
}
function removeFrontmatterKey(lines: string[], key: string): string[] {
	const normalizedKey = key.toLowerCase();
	return lines.filter((line) => parseFrontmatterKey(line) !== normalizedKey);
}
function normalizeAgentName(name: string): string {
	return name.trim().toLowerCase();
}
function getAgentsDir(agentDir: string): string {
	return join(agentDir, "agents");
}
function listAgentFiles(agentsDir: string): string[] {
	if (!existsSync(agentsDir)) return [];
	return readdirSync(agentsDir, { withFileTypes: true })
		.filter((entry: Dirent) => (entry.isFile() || entry.isSymbolicLink()) && entry.name.endsWith(".md"))
		.filter((entry) => !entry.name.endsWith(".chain.md"))
		.map((entry) => join(agentsDir, entry.name));
}
function readAgentConfig(filePath: string): SubagentModelConfig {
	const content = readFileSync(filePath, "utf8");
	const parsed = parseFrontmatterDocument(content);
	const fallbackName = basename(filePath, ".md");
	if (!parsed) return { agent: fallbackName, filePath };
	return {
		agent: getFrontmatterValue(parsed.lines, "name") ?? fallbackName,
		model: getFrontmatterValue(parsed.lines, "model"),
		filePath,
	};
}
function listSubagentModelConfigs(agentDir: string): SubagentModelConfig[] {
	return listAgentFiles(getAgentsDir(agentDir))
		.map((filePath) => readAgentConfig(filePath))
		.sort((left, right) => left.agent.localeCompare(right.agent));
}
function findAgentConfig(configs: SubagentModelConfig[], agentName: string): SubagentModelConfig | undefined {
	const normalized = normalizeAgentName(agentName);
	return (
		configs.find((config) => normalizeAgentName(config.agent) === normalized) ??
		configs.find((config) => normalizeAgentName(basename(config.filePath, ".md")) === normalized)
	);
}
function getAgentConfigOrThrow(agentDir: string, agentName: string): SubagentModelConfig {
	const configs = listSubagentModelConfigs(agentDir);
	const target = findAgentConfig(configs, agentName);
	if (target) return target;
	if (configs.length === 0) {
		throw new Error(`No subagent definitions found in ${getAgentsDir(agentDir)}.`);
	}
	const availableAgents = configs.map((config) => config.agent).join(", ");
	throw new Error(`Unknown subagent: ${agentName}. Available agents: ${availableAgents}`);
}
function setSubagentModel(agentDir: string, agentName: string, modelSpec: string): void {
	const normalizedModelSpec = modelSpec.trim();
	if (!normalizedModelSpec) throw new Error("Model spec cannot be empty.");
	const target = getAgentConfigOrThrow(agentDir, agentName);
	const content = readFileSync(target.filePath, "utf8");
	const parsed = parseFrontmatterDocument(content);
	if (!parsed) {
		const eol = detectEol(content);
		const injected = `---${eol}name: ${target.agent}${eol}model: ${normalizedModelSpec}${eol}---${eol}${content}`;
		writeFileSync(target.filePath, injected, "utf8");
		return;
	}
	const nextLines = upsertFrontmatterValue(parsed.lines, "model", normalizedModelSpec);
	if (nextLines.join("\n") !== parsed.lines.join("\n")) {
		writeFileSync(target.filePath, serializeFrontmatterDocument({ ...parsed, lines: nextLines }), "utf8");
	}
}
function unsetSubagentModel(agentDir: string, agentName: string): void {
	const target = getAgentConfigOrThrow(agentDir, agentName);
	const content = readFileSync(target.filePath, "utf8");
	const parsed = parseFrontmatterDocument(content);
	if (!parsed) return;
	const nextLines = removeFrontmatterKey(parsed.lines, "model");
	if (nextLines.join("\n") !== parsed.lines.join("\n")) {
		writeFileSync(target.filePath, serializeFrontmatterDocument({ ...parsed, lines: nextLines }), "utf8");
	}
}
async function selectOption<T>(
	ctx: CommandContext,
	title: string,
	options: SelectOption<T>[],
): Promise<T | undefined> {
	const selected = await ctx.ui.select(
		title,
		options.map((option) => option.label),
	);
	if (!selected) return undefined;
	return options.find((option) => option.label === selected)?.value;
}
export function registerFeynmanModelCommand(pi: ExtensionAPI): void {
	pi.registerCommand("feynman-model", {
		description: "Open Feynman model menu (main + per-subagent overrides).",
		handler: async (_args, ctx) => {
			if (!ctx.hasUI) {
				ctx.ui.notify("feynman-model requires interactive mode.", "error");
				return;
			}
			try {
				ctx.modelRegistry.refresh();
				const availableModels = [...ctx.modelRegistry.getAvailable()].sort((left, right) =>
					formatModelSpec(left).localeCompare(formatModelSpec(right)),
				);
				if (availableModels.length === 0) {
					ctx.ui.notify("No models available.", "error");
					return;
				}
				const agentDir = resolveFeynmanAgentDir();
				const subagentConfigs = listSubagentModelConfigs(agentDir);
				const currentMain = ctx.model ? formatModelSpec(ctx.model) : "(none)";
				const targetOptions: SelectOption<TargetChoice>[] = [
					{ label: `main (default): ${currentMain}`, value: { type: "main" } },
					...subagentConfigs.map((config) => ({
						label: `${config.agent}: ${config.model ?? "default"}`,
						value: { type: "subagent" as const, agent: config.agent, model: config.model },
					})),
				];
				const target = await selectOption(ctx, "Choose target", targetOptions);
				if (!target) return;
				if (target.type === "main") {
					const selectedModel = await selectOption(
						ctx,
						"Select main model",
						availableModels.map((model) => {
							const spec = formatModelSpec(model);
							const suffix = spec === currentMain ? " (current)" : "";
							return { label: `${spec}${suffix}`, value: model };
						}),
					);
					if (!selectedModel) return;
					const success = await pi.setModel(selectedModel);
					if (!success) {
						ctx.ui.notify(`No API key found for ${selectedModel.provider}.`, "error");
						return;
					}
					ctx.ui.notify(`Main model set to ${formatModelSpec(selectedModel)}.`, "info");
					return;
				}
				const selectedSubagentModel = await selectOption(
					ctx,
					`Select model for ${target.agent}`,
					[
						{
							label: target.model ? "(inherit main default)" : "(inherit main default) (current)",
							value: INHERIT_MAIN,
						},
						...availableModels.map((model) => {
							const spec = formatModelSpec(model);
							const suffix = spec === target.model ? " (current)" : "";
							return { label: `${spec}${suffix}`, value: spec };
						}),
					],
				);
				if (!selectedSubagentModel) return;
				if (selectedSubagentModel === INHERIT_MAIN) {
					unsetSubagentModel(agentDir, target.agent);
					ctx.ui.notify(`${target.agent} now inherits the main model.`, "info");
					return;
				}
				setSubagentModel(agentDir, target.agent, selectedSubagentModel);
				ctx.ui.notify(`${target.agent} model set to ${selectedSubagentModel}.`, "info");
			} catch (error) {
				ctx.ui.notify(error instanceof Error ? error.message : String(error), "error");
			}
		},
	});
}
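The frontmatter helpers above implement a parse, upsert, serialize round-trip over agent markdown files. A minimal standalone sketch of that behavior (re-implemented here for illustration; the regex and helper name are assumptions, not the extension's actual `FRONTMATTER_PATTERN`):

```typescript
// Minimal sketch of the frontmatter model-key upsert, for illustration only.
const FRONTMATTER = /^---\n([\s\S]*?)\n---\n?([\s\S]*)$/;

function upsertModel(doc: string, model: string): string {
	const match = doc.match(FRONTMATTER);
	// No frontmatter: inject a fresh block in front of the body.
	if (!match) return `---\nmodel: ${model}\n---\n${doc}`;
	const lines = match[1].split("\n");
	const idx = lines.findIndex((l) => /^\s*model\s*:/i.test(l));
	// Replace an existing model key in place, otherwise append one.
	if (idx !== -1) lines[idx] = `model: ${model}`;
	else lines.push(`model: ${model}`);
	return `---\n${lines.join("\n")}\n---\n${match[2]}`;
}

const agent = "---\nname: researcher\ndescription: Literature review\n---\nPrompt body.\n";
console.log(upsertModel(agent, "anthropic/claude-sonnet-4"));
```

Unlike this sketch, the real helpers also preserve CRLF line endings and insert the key after `description:`/`name:` rather than at the end.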


@@ -4,6 +4,7 @@ import { execSync } from "node:child_process";
 import { resolve as resolvePath } from "node:path";
 import type { ExtensionAPI, ExtensionContext } from "@mariozechner/pi-coding-agent";
+import { truncateToWidth, visibleWidth } from "@mariozechner/pi-tui";
 import {
 	APP_ROOT,
@@ -11,10 +12,8 @@ import {
 	FEYNMAN_VERSION,
 } from "./shared.js";
-const ANSI_RE = /\x1b\[[0-9;]*m/g;
 function visibleLength(text: string): number {
-	return text.replace(ANSI_RE, "").length;
+	return visibleWidth(text);
 }
 function formatHeaderPath(path: string): string {
@@ -23,10 +22,8 @@ function formatHeaderPath(path: string): string {
 }
 function truncateVisible(text: string, maxVisible: number): string {
-	const raw = text.replace(ANSI_RE, "");
-	if (raw.length <= maxVisible) return text;
-	if (maxVisible <= 3) return ".".repeat(maxVisible);
-	return `${raw.slice(0, maxVisible - 3)}...`;
+	if (visibleWidth(text) <= maxVisible) return text;
+	return truncateToWidth(text, maxVisible, maxVisible <= 3 ? "" : "...");
 }
 function wrapWords(text: string, maxW: number): string[] {
@@ -34,12 +31,12 @@ function wrapWords(text: string, maxW: number): string[] {
 	const lines: string[] = [];
 	let cur = "";
 	for (let word of words) {
-		if (word.length > maxW) {
+		if (visibleWidth(word) > maxW) {
 			if (cur) { lines.push(cur); cur = ""; }
-			word = maxW > 3 ? `${word.slice(0, maxW - 1)}…` : word.slice(0, maxW);
+			word = truncateToWidth(word, maxW, maxW > 3 ? "…" : "");
 		}
 		const test = cur ? `${cur} ${word}` : word;
-		if (cur && test.length > maxW) {
+		if (cur && visibleWidth(test) > maxW) {
 			lines.push(cur);
 			cur = word;
 		} else {
@@ -56,9 +53,10 @@ function padRight(text: string, width: number): string {
 }
 function centerText(text: string, width: number): string {
-	if (text.length >= width) return text.slice(0, width);
-	const left = Math.floor((width - text.length) / 2);
-	const right = width - text.length - left;
+	const textWidth = visibleWidth(text);
+	if (textWidth >= width) return truncateToWidth(text, width, "");
+	const left = Math.floor((width - textWidth) / 2);
+	const right = width - textWidth - left;
 	return `${" ".repeat(left)}${text}${" ".repeat(right)}`;
 }
@@ -287,8 +285,8 @@ export function installFeynmanHeader(
 	if (activity) {
 		const maxActivityLen = leftW * 2;
-		const trimmed = activity.length > maxActivityLen
-			? `${activity.slice(0, maxActivityLen - 1)}…`
+		const trimmed = visibleWidth(activity) > maxActivityLen
+			? truncateToWidth(activity, maxActivityLen, "…")
 			: activity;
 		leftLines.push("");
 		leftLines.push(theme.fg("accent", theme.bold("Last Activity")));


@@ -0,0 +1,174 @@
import { homedir } from "node:os";
import { readFileSync, writeFileSync } from "node:fs";
import { resolve } from "node:path";
import type { ExtensionAPI } from "@mariozechner/pi-coding-agent";
const FEYNMAN_SERVICE_TIERS = [
	"auto",
	"default",
	"flex",
	"priority",
	"standard_only",
] as const;
type FeynmanServiceTier = (typeof FEYNMAN_SERVICE_TIERS)[number];
const SERVICE_TIER_SET = new Set<string>(FEYNMAN_SERVICE_TIERS);
const OPENAI_SERVICE_TIERS = new Set<FeynmanServiceTier>(["auto", "default", "flex", "priority"]);
const ANTHROPIC_SERVICE_TIERS = new Set<FeynmanServiceTier>(["auto", "standard_only"]);
type CommandContext = Parameters<Parameters<ExtensionAPI["registerCommand"]>[1]["handler"]>[1];
type SelectOption<T> = {
	label: string;
	value: T;
};
function resolveFeynmanSettingsPath(): string {
	const configured = process.env.PI_CODING_AGENT_DIR?.trim();
	const agentDir = configured
		? configured.startsWith("~/")
			? resolve(homedir(), configured.slice(2))
			: resolve(configured)
		: resolve(homedir(), ".feynman", "agent");
	return resolve(agentDir, "settings.json");
}
function normalizeServiceTier(value: string | undefined): FeynmanServiceTier | undefined {
	if (!value) return undefined;
	const normalized = value.trim().toLowerCase();
	return SERVICE_TIER_SET.has(normalized) ? (normalized as FeynmanServiceTier) : undefined;
}
function getConfiguredServiceTier(settingsPath: string): FeynmanServiceTier | undefined {
	try {
		const parsed = JSON.parse(readFileSync(settingsPath, "utf8")) as { serviceTier?: string };
		return normalizeServiceTier(parsed.serviceTier);
	} catch {
		return undefined;
	}
}
function setConfiguredServiceTier(settingsPath: string, tier: FeynmanServiceTier | undefined): void {
	let settings: Record<string, unknown> = {};
	try {
		settings = JSON.parse(readFileSync(settingsPath, "utf8")) as Record<string, unknown>;
	} catch {}
	if (tier) {
		settings.serviceTier = tier;
	} else {
		delete settings.serviceTier;
	}
	writeFileSync(settingsPath, JSON.stringify(settings, null, 2) + "\n", "utf8");
}
function resolveActiveServiceTier(settingsPath: string): FeynmanServiceTier | undefined {
	return normalizeServiceTier(process.env.FEYNMAN_SERVICE_TIER) ?? getConfiguredServiceTier(settingsPath);
}
function resolveProviderServiceTier(
	provider: string | undefined,
	tier: FeynmanServiceTier | undefined,
): FeynmanServiceTier | undefined {
	if (!provider || !tier) return undefined;
	if ((provider === "openai" || provider === "openai-codex") && OPENAI_SERVICE_TIERS.has(tier)) {
		return tier;
	}
	if (provider === "anthropic" && ANTHROPIC_SERVICE_TIERS.has(tier)) {
		return tier;
	}
	return undefined;
}
async function selectOption<T>(
	ctx: CommandContext,
	title: string,
	options: SelectOption<T>[],
): Promise<T | undefined> {
	const selected = await ctx.ui.select(
		title,
		options.map((option) => option.label),
	);
	if (!selected) return undefined;
	return options.find((option) => option.label === selected)?.value;
}
function parseRequestedTier(rawArgs: string): FeynmanServiceTier | null | undefined {
	const trimmed = rawArgs.trim();
	if (!trimmed) return undefined;
	if (trimmed === "unset" || trimmed === "clear" || trimmed === "off") return null;
	return normalizeServiceTier(trimmed);
}
export function registerServiceTierControls(pi: ExtensionAPI): void {
	pi.on("before_provider_request", (event, ctx) => {
		if (!ctx.model || !event.payload || typeof event.payload !== "object") {
			return;
		}
		const activeTier = resolveActiveServiceTier(resolveFeynmanSettingsPath());
		const providerTier = resolveProviderServiceTier(ctx.model.provider, activeTier);
		if (!providerTier) {
			return;
		}
		return {
			...(event.payload as Record<string, unknown>),
			service_tier: providerTier,
		};
	});
	pi.registerCommand("service-tier", {
		description: "View or set the provider service tier override used for supported models.",
		handler: async (args, ctx) => {
			const settingsPath = resolveFeynmanSettingsPath();
			const requested = parseRequestedTier(args);
			if (requested === undefined && !args.trim()) {
				if (!ctx.hasUI) {
					ctx.ui.notify(getConfiguredServiceTier(settingsPath) ?? "not set", "info");
					return;
				}
				const current = getConfiguredServiceTier(settingsPath);
				const selected = await selectOption(
					ctx,
					"Select service tier",
					[
						{ label: current ? `unset (current: ${current})` : "unset (current)", value: null },
						...FEYNMAN_SERVICE_TIERS.map((tier) => ({
							label: tier === current ? `${tier} (current)` : tier,
							value: tier,
						})),
					],
				);
				if (selected === undefined) return;
				if (selected === null) {
					setConfiguredServiceTier(settingsPath, undefined);
					ctx.ui.notify("Cleared service tier override.", "info");
					return;
				}
				setConfiguredServiceTier(settingsPath, selected);
				ctx.ui.notify(`Service tier set to ${selected}.`, "info");
				return;
			}
			if (requested === null) {
				setConfiguredServiceTier(settingsPath, undefined);
				ctx.ui.notify("Cleared service tier override.", "info");
				return;
			}
			if (!requested) {
				ctx.ui.notify("Use auto, default, flex, priority, standard_only, or unset.", "error");
				return;
			}
			setConfiguredServiceTier(settingsPath, requested);
			ctx.ui.notify(`Service tier set to ${requested}.`, "info");
		},
	});
}
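The service-tier extension resolves an active tier with a two-level precedence (the `FEYNMAN_SERVICE_TIER` env var wins over `settings.json`), then filters it per provider. A standalone sketch of that precedence logic (re-implemented here for illustration; the provider tier sets mirror the constants above):

```typescript
// Sketch of the tier resolution precedence, for illustration only.
const TIERS = ["auto", "default", "flex", "priority", "standard_only"] as const;
type Tier = (typeof TIERS)[number];

// Which tiers each provider accepts (mirrors the extension's constants).
const PROVIDER_TIERS: Record<string, ReadonlySet<Tier>> = {
	openai: new Set<Tier>(["auto", "default", "flex", "priority"]),
	anthropic: new Set<Tier>(["auto", "standard_only"]),
};

function resolveTier(env: string | undefined, settings: string | undefined, provider: string): Tier | undefined {
	const normalize = (v?: string): Tier | undefined => {
		const n = v?.trim().toLowerCase();
		return n && (TIERS as readonly string[]).includes(n) ? (n as Tier) : undefined;
	};
	// Env override wins; the persisted setting is the fallback.
	const active = normalize(env) ?? normalize(settings);
	// Drop the tier entirely if this provider does not support it.
	return active && PROVIDER_TIERS[provider]?.has(active) ? active : undefined;
}

console.log(resolveTier("flex", "priority", "openai")); // env wins over settings
console.log(resolveTier(undefined, "flex", "anthropic")); // unsupported for provider
```

When no supported tier resolves, the `before_provider_request` hook above leaves the request payload untouched rather than sending an invalid `service_tier`.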


@@ -35,9 +35,14 @@ export function readPromptSpecs(appRoot) {
 }
 export const extensionCommandSpecs = [
+	{ name: "capabilities", args: "", section: "Project & Session", description: "Show installed packages, discovery entrypoints, and runtime capability counts.", publicDocs: true },
+	{ name: "commands", args: "", section: "Project & Session", description: "Browse all available slash commands, including built-in and package commands.", publicDocs: true },
 	{ name: "help", args: "", section: "Project & Session", description: "Show grouped Feynman commands and prefill the editor with a selected command.", publicDocs: true },
+	{ name: "feynman-model", args: "", section: "Project & Session", description: "Open Feynman model menu (main + per-subagent overrides).", publicDocs: true },
 	{ name: "init", args: "", section: "Project & Session", description: "Bootstrap AGENTS.md and session-log folders for a research project.", publicDocs: true },
 	{ name: "outputs", args: "", section: "Project & Session", description: "Browse all research artifacts (papers, outputs, experiments, notes).", publicDocs: true },
+	{ name: "service-tier", args: "", section: "Project & Session", description: "View or set the provider service tier override for supported models.", publicDocs: true },
+	{ name: "tools", args: "", section: "Project & Session", description: "Browse all callable tools with their source and parameter summary.", publicDocs: true },
 ];
 export const livePackageCommandGroups = [
@@ -57,6 +62,7 @@ export const livePackageCommandGroups = [
 	{ name: "schedule-prompt", usage: "/schedule-prompt" },
 	{ name: "search", usage: "/search" },
 	{ name: "preview", usage: "/preview" },
+	{ name: "hotkeys", usage: "/hotkeys" },
 	{ name: "new", usage: "/new" },
 	{ name: "quit", usage: "/quit" },
 	{ name: "exit", usage: "/exit" },
@@ -83,6 +89,7 @@ export const cliCommandSections = [
 	{ usage: "feynman model login [id]", description: "Login to a Pi OAuth model provider." },
 	{ usage: "feynman model logout [id]", description: "Logout from a Pi OAuth model provider." },
 	{ usage: "feynman model set <provider/model>", description: "Set the default model." },
+	{ usage: "feynman model tier [value]", description: "View or set the request service tier override." },
 	],
 },
@@ -99,6 +106,8 @@ export const cliCommandSections = [
 	{ usage: "feynman packages list", description: "Show core and optional Pi package presets." },
 	{ usage: "feynman packages install <preset>", description: "Install optional package presets on demand." },
 	{ usage: "feynman search status", description: "Show Pi web-access status and config path." },
+	{ usage: "feynman search set <provider> [api-key]", description: "Set the web search provider and optionally save its API key." },
+	{ usage: "feynman search clear", description: "Reset web search provider to auto while preserving API keys." },
 	{ usage: "feynman update [package]", description: "Update installed packages, or a specific package." },
 	],
 },
@@ -110,6 +119,7 @@ export const legacyFlags = [
 	{ usage: "--alpha-logout", description: "Clear alphaXiv auth and exit." },
 	{ usage: "--alpha-status", description: "Show alphaXiv auth status and exit." },
 	{ usage: "--model <provider:model>", description: "Force a specific model." },
+	{ usage: "--service-tier <tier>", description: "Override request service tier for this run." },
 	{ usage: "--thinking <level>", description: "Set thinking level: off | minimal | low | medium | high | xhigh." },
 	{ usage: "--cwd <path>", description: "Set the working directory for tools." },
 	{ usage: "--session-dir <path>", description: "Set the session storage directory." },

package-lock.json (generated)

@@ -1,17 +1,18 @@
 {
   "name": "@companion-ai/feynman",
-  "version": "0.2.13",
+  "version": "0.2.17",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "@companion-ai/feynman",
-      "version": "0.2.13",
+      "version": "0.2.17",
+      "hasInstallScript": true,
       "license": "MIT",
       "dependencies": {
         "@companion-ai/alpha-hub": "^0.1.2",
-        "@mariozechner/pi-ai": "^0.62.0",
-        "@mariozechner/pi-coding-agent": "^0.62.0",
+        "@mariozechner/pi-ai": "^0.64.0",
+        "@mariozechner/pi-coding-agent": "^0.64.0",
         "@sinclair/typebox": "^0.34.48",
         "dotenv": "^17.3.1"
       },
@@ -24,7 +25,7 @@
         "typescript": "^5.9.3"
       },
       "engines": {
-        "node": ">=20.18.1"
+        "node": ">=20.19.0"
       }
     },
     "node_modules/@anthropic-ai/sdk": {
@@ -1264,9 +1265,9 @@
       }
     },
     "node_modules/@hono/node-server": {
-      "version": "1.19.11",
-      "resolved": "https://registry.npmjs.org/@hono/node-server/-/node-server-1.19.11.tgz",
-      "integrity": "sha512-dr8/3zEaB+p0D2n/IUrlPF1HZm586qgJNXK1a9fhg/PzdtkK7Ksd5l312tJX2yBuALqDYBlG20QEbayqPyxn+g==",
+      "version": "1.19.13",
+      "resolved": "https://registry.npmjs.org/@hono/node-server/-/node-server-1.19.13.tgz",
+      "integrity": "sha512-TsQLe4i2gvoTtrHje625ngThGBySOgSK3Xo2XRYOdqGN1teR8+I7vchQC46uLJi8OF62YTYA3AhSpumtkhsaKQ==",
       "license": "MIT",
       "engines": {
         "node": ">=18.14.1"
@@ -1468,21 +1469,21 @@
       }
     },
     "node_modules/@mariozechner/pi-agent-core": {
-      "version": "0.62.0",
-      "resolved": "https://registry.npmjs.org/@mariozechner/pi-agent-core/-/pi-agent-core-0.62.0.tgz",
-      "integrity": "sha512-SBjqgDrgKOaC+IGzFGB3jXQErv9H1QMYnWFvUg6ra6dG0ZgWFBUZb6unidngWLsmaxSDWes6KeKiVFMsr2VSEQ==",
+      "version": "0.64.0",
+      "resolved": "https://registry.npmjs.org/@mariozechner/pi-agent-core/-/pi-agent-core-0.64.0.tgz",
+      "integrity": "sha512-IN/sIxWOD0v1OFVXHB605SGiZhO5XdEWG5dO8EAV08n3jz/p12o4OuYGvhGXmHhU28WXa/FGWC+FO5xiIih8Uw==",
       "license": "MIT",
       "dependencies": {
-        "@mariozechner/pi-ai": "^0.62.0"
+        "@mariozechner/pi-ai": "^0.64.0"
       },
       "engines": {
         "node": ">=20.0.0"
       }
     },
     "node_modules/@mariozechner/pi-ai": {
-      "version": "0.62.0",
-      "resolved": "https://registry.npmjs.org/@mariozechner/pi-ai/-/pi-ai-0.62.0.tgz",
-      "integrity": "sha512-mJgryZ5RgBQG++tiETMtCQQJoH2MAhKetCfqI98NMvGydu7L9x2qC2JekQlRaAgIlTgv4MRH1UXHMEs4UweE/Q==",
+      "version": "0.64.0",
+      "resolved": "https://registry.npmjs.org/@mariozechner/pi-ai/-/pi-ai-0.64.0.tgz",
+      "integrity": "sha512-Z/Jnf+JSVDPLRcxJsa8XhYTJKIqKekNueaCpBLGQHgizL1F9RQ1Rur3rIfZpfXkt2cLu/AIPtOs223ueuoWaWg==",
       "license": "MIT",
       "dependencies": {
         "@anthropic-ai/sdk": "^0.73.0",
@@ -1507,16 +1508,17 @@
       }
     },
     "node_modules/@mariozechner/pi-coding-agent": {
-      "version": "0.62.0",
-      "resolved": "https://registry.npmjs.org/@mariozechner/pi-coding-agent/-/pi-coding-agent-0.62.0.tgz",
-      "integrity": "sha512-f1NnExqsHuA6w8UVlBtPsvTBhdkMc0h1JD9SzGCdWTLou5GHJr2JIP6DlwV9IKWAnM+sAelaoFez+14wLP2zOQ==",
+      "version": "0.64.0",
+      "resolved": "https://registry.npmjs.org/@mariozechner/pi-coding-agent/-/pi-coding-agent-0.64.0.tgz",
+      "integrity": "sha512-Q4tcqSqFGQtOgCtRyIp1D80Nv2if13Q2pfbnrOlaT/mix90mLcZGML9jKVnT1jGSy5GMYudU1HsS7cx53kxb0g==",
       "license": "MIT",
       "dependencies": {
         "@mariozechner/jiti": "^2.6.2",
-        "@mariozechner/pi-agent-core": "^0.62.0",
-        "@mariozechner/pi-ai": "^0.62.0",
-        "@mariozechner/pi-tui": "^0.62.0",
+        "@mariozechner/pi-agent-core": "^0.64.0",
+        "@mariozechner/pi-ai": "^0.64.0",
+        "@mariozechner/pi-tui": "^0.64.0",
         "@silvia-odwyer/photon-node": "^0.3.4",
+        "ajv": "^8.17.1",
         "chalk": "^5.5.0",
         "cli-highlight": "^2.1.11",
         "diff": "^8.0.2",
@@ -1543,9 +1545,9 @@
       }
     },
     "node_modules/@mariozechner/pi-tui": {
-      "version": "0.62.0",
-      "resolved": "https://registry.npmjs.org/@mariozechner/pi-tui/-/pi-tui-0.62.0.tgz",
-      "integrity": "sha512-/At11PPe8l319MnUoK4wN5L/uVCU6bDdiIUzH8Ez0stOkjSF6isRXScZ+RMM+6iCKsD4muBTX8Cmcif+3/UWHA==",
+      "version": "0.64.0",
+      "resolved": "https://registry.npmjs.org/@mariozechner/pi-tui/-/pi-tui-0.64.0.tgz",
+      "integrity": "sha512-W1qLry9MAuN/V3YJmMv/BJa0VaYv721NkXPg/DGItdqWxuDc+1VdNbyAnRwxblNkIpXVUWL26x64BlyFXpxmkg==",
       "license": "MIT",
       "dependencies": {
         "@types/mime-types": "^2.1.4",
@@ -2528,9 +2530,9 @@
       "license": "MIT"
     },
     "node_modules/basic-ftp": {
-      "version": "5.2.0",
-      "resolved": "https://registry.npmjs.org/basic-ftp/-/basic-ftp-5.2.0.tgz",
-      "integrity": "sha512-VoMINM2rqJwJgfdHq6RiUudKt2BV+FY5ZFezP/ypmwayk68+NzzAQy4XXLlqsGD4MCzq3DrmNFD/uUmBJuGoXw==",
+      "version": "5.2.1",
+      "resolved": "https://registry.npmjs.org/basic-ftp/-/basic-ftp-5.2.1.tgz",
+      "integrity": "sha512-0yaL8JdxTknKDILitVpfYfV2Ob6yb3udX/hK97M7I3jOeznBNxQPtVvTUtnhUkyHlxFWyr5Lvknmgzoc7jf+1Q==",
      "license": "MIT",
       "engines": {
         "node": ">=10.0.0"
@@ -2576,9 +2578,9 @@
       "license": "MIT"
     },
     "node_modules/brace-expansion": {
-      "version": "5.0.4",
-      "resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-5.0.4.tgz",
-      "integrity": "sha512-h+DEnpVvxmfVefa4jFbCf5HdH5YMDXRsmKflpf1pILZWRFlTbJpxeU55nJl4Smt5HQaGzg1o6RHFPJaOqnmBDg==",
+      "version": "5.0.5",
+      "resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-5.0.5.tgz",
+      "integrity": "sha512-VZznLgtwhn+Mact9tfiwx64fA9erHH/MCXEUfB/0bX/6Fz6ny5EGTXYltMocqg4xFAQZtnO3DHWWXi8RiuN7cQ==",
       "license": "MIT",
       "dependencies": {
         "balanced-match": "^4.0.2"
@@ -3621,9 +3623,9 @@
       }
     },
     "node_modules/hono": {
-      "version": "4.12.9",
-      "resolved": "https://registry.npmjs.org/hono/-/hono-4.12.9.tgz",
-      "integrity": "sha512-wy3T8Zm2bsEvxKZM5w21VdHDDcwVS1yUFFY6i8UobSsKfFceT7TOwhbhfKsDyx7tYQlmRM5FLpIuYvNFyjctiA==",
+      "version": "4.12.12",
+      "resolved": "https://registry.npmjs.org/hono/-/hono-4.12.12.tgz",
+      "integrity": "sha512-p1JfQMKaceuCbpJKAPKVqyqviZdS0eUxH9v82oWo1kb9xjQ5wA6iP3FNVAPDFlz5/p7d45lO+BpSk1tuSZMF4Q==",
       "license": "MIT",
       "engines": {
         "node": ">=16.9.0"
@@ -4216,9 +4218,9 @@
       }
     },
     "node_modules/path-to-regexp": {
-      "version": "8.3.0",
-      "resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-8.3.0.tgz",
-      "integrity": "sha512-7jdwVIRtsP8MYpdXSwOS0YdD0Du+qOoF/AEPIt88PcCFrZCzx41oxku1jD88hZBwbNUIEfpqvuhjFaMAqMTWnA==",
+      "version": "8.4.2",
+      "resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-8.4.2.tgz",
+      "integrity": "sha512-qRcuIdP69NPm4qbACK+aDogI5CBDMi1jKe0ry5rSQJz8JVLsC7jV8XpiJjGRLLol3N+R5ihGYcrPLTno6pAdBA==",
       "license": "MIT",
       "funding": {
         "type": "opencollective",


@@ -1,11 +1,11 @@
 {
 	"name": "@companion-ai/feynman",
-	"version": "0.2.13",
+	"version": "0.2.17",
 	"description": "Research-first CLI agent built on Pi and alphaXiv",
 	"license": "MIT",
 	"type": "module",
 	"engines": {
-		"node": ">=20.18.1"
+		"node": ">=20.19.0"
 	},
 	"bin": {
 		"feynman": "bin/feynman.js"
@@ -26,14 +26,16 @@
 		"scripts/",
 		"skills/",
 		"AGENTS.md",
+		"CONTRIBUTING.md",
 		"README.md",
 		".env.example"
 	],
 	"scripts": {
+		"preinstall": "node ./scripts/check-node-version.mjs",
 		"build": "tsc -p tsconfig.build.json",
 		"build:native-bundle": "node ./scripts/build-native-bundle.mjs",
 		"dev": "tsx src/index.ts",
-		"prepack": "npm run build && node ./scripts/prepare-runtime-workspace.mjs",
+		"prepack": "node ./scripts/clean-publish-artifacts.mjs && npm run build && node ./scripts/prepare-runtime-workspace.mjs",
 		"start": "tsx src/index.ts",
 		"start:dist": "node ./bin/feynman.js",
 		"test": "node --import tsx --test --test-concurrency=1 tests/*.test.ts",
@@ -58,11 +60,33 @@
 	},
 	"dependencies": {
 		"@companion-ai/alpha-hub": "^0.1.2",
-		"@mariozechner/pi-ai": "^0.62.0",
-		"@mariozechner/pi-coding-agent": "^0.62.0",
+		"@mariozechner/pi-ai": "^0.64.0",
+		"@mariozechner/pi-coding-agent": "^0.64.0",
 		"@sinclair/typebox": "^0.34.48",
 		"dotenv": "^17.3.1"
 	},
+	"overrides": {
+		"basic-ftp": "5.2.1",
+		"@modelcontextprotocol/sdk": {
+			"@hono/node-server": "1.19.13",
+			"hono": "4.12.12"
+		},
+		"express": {
+			"router": {
+				"path-to-regexp": "8.4.2"
+			}
+		},
+		"proxy-agent": {
+			"pac-proxy-agent": {
+				"get-uri": {
+					"basic-ftp": "5.2.1"
+				}
+			}
+		},
+		"minimatch": {
+			"brace-expansion": "5.0.5"
+		}
+	},
 	"devDependencies": {
 		"@types/node": "^25.5.0",
 		"tsx": "^4.21.0",


@@ -6,13 +6,45 @@ import { spawnSync } from "node:child_process";
const appRoot = resolve(import.meta.dirname, ".."); const appRoot = resolve(import.meta.dirname, "..");
const packageJson = JSON.parse(readFileSync(resolve(appRoot, "package.json"), "utf8")); const packageJson = JSON.parse(readFileSync(resolve(appRoot, "package.json"), "utf8"));
const packageLockPath = resolve(appRoot, "package-lock.json"); const packageLockPath = resolve(appRoot, "package-lock.json");
const bundledNodeVersion = process.env.FEYNMAN_BUNDLED_NODE_VERSION ?? process.version.slice(1); const minBundledNodeVersion = packageJson.engines?.node?.replace(/^>=/, "").trim() || process.version.slice(1);
+function parseSemver(version) {
+const [major = "0", minor = "0", patch = "0"] = version.split(".");
+return [Number.parseInt(major, 10) || 0, Number.parseInt(minor, 10) || 0, Number.parseInt(patch, 10) || 0];
+}
+function compareSemver(left, right) {
+for (let index = 0; index < 3; index += 1) {
+const diff = left[index] - right[index];
+if (diff !== 0) return diff;
+}
+return 0;
+}
 function fail(message) {
 console.error(`[feynman] ${message}`);
 process.exit(1);
 }
+function resolveBundledNodeVersion() {
+const requestedNodeVersion = process.env.FEYNMAN_BUNDLED_NODE_VERSION?.trim();
+if (requestedNodeVersion) {
+if (compareSemver(parseSemver(requestedNodeVersion), parseSemver(minBundledNodeVersion)) < 0) {
+fail(
+`FEYNMAN_BUNDLED_NODE_VERSION=${requestedNodeVersion} is below the supported floor ${minBundledNodeVersion}`,
+);
+}
+return requestedNodeVersion;
+}
+const currentNodeVersion = process.version.slice(1);
+return compareSemver(parseSemver(currentNodeVersion), parseSemver(minBundledNodeVersion)) < 0
+? minBundledNodeVersion
+: currentNodeVersion;
+}
+const bundledNodeVersion = resolveBundledNodeVersion();
 function resolveCommand(command) {
 if (process.platform === "win32" && command === "npm") {
 return "npm.cmd";
@@ -243,7 +275,8 @@ function writeLauncher(bundleRoot, target) {
"@echo off", "@echo off",
"setlocal", "setlocal",
'set "ROOT=%~dp0"', 'set "ROOT=%~dp0"',
'"%ROOT%node\\node.exe" "%ROOT%app\\bin\\feynman.js" %*', 'if "%ROOT:~-1%"=="\\" set "ROOT=%ROOT:~0,-1%"',
'"%ROOT%\\node\\node.exe" "%ROOT%\\app\\bin\\feynman.js" %*',
"", "",
].join("\r\n"), ].join("\r\n"),
"utf8", "utf8",


@@ -0,0 +1,40 @@
const MIN_NODE_VERSION = "20.19.0";
function parseNodeVersion(version) {
const [major = "0", minor = "0", patch = "0"] = version.replace(/^v/, "").split(".");
return {
major: Number.parseInt(major, 10) || 0,
minor: Number.parseInt(minor, 10) || 0,
patch: Number.parseInt(patch, 10) || 0,
};
}
function compareNodeVersions(left, right) {
if (left.major !== right.major) return left.major - right.major;
if (left.minor !== right.minor) return left.minor - right.minor;
return left.patch - right.patch;
}
function isSupportedNodeVersion(version = process.versions.node) {
return compareNodeVersions(parseNodeVersion(version), parseNodeVersion(MIN_NODE_VERSION)) >= 0;
}
function getUnsupportedNodeVersionLines(version = process.versions.node) {
const isWindows = process.platform === "win32";
return [
`feynman requires Node.js ${MIN_NODE_VERSION} or later (detected ${version}).`,
isWindows
? "Install a newer Node.js from https://nodejs.org, or use the standalone installer:"
: "Switch to Node 20 with `nvm install 20 && nvm use 20`, or use the standalone installer:",
isWindows
? "irm https://feynman.is/install.ps1 | iex"
: "curl -fsSL https://feynman.is/install | bash",
];
}
if (!isSupportedNodeVersion()) {
for (const line of getUnsupportedNodeVersionLines()) {
console.error(line);
}
process.exit(1);
}


@@ -0,0 +1,8 @@
import { rmSync } from "node:fs";
import { resolve } from "node:path";
const appRoot = resolve(import.meta.dirname, "..");
const releaseDir = resolve(appRoot, "dist", "release");
rmSync(releaseDir, { recursive: true, force: true });
console.log("[feynman] removed dist/release before npm pack/publish");


@@ -0,0 +1,128 @@
param(
[string]$Version = "latest",
[ValidateSet("User", "Repo")]
[string]$Scope = "User",
[string]$TargetDir = ""
)
$ErrorActionPreference = "Stop"
function Normalize-Version {
param([string]$RequestedVersion)
if (-not $RequestedVersion) {
return "latest"
}
switch ($RequestedVersion.ToLowerInvariant()) {
"latest" { return "latest" }
"stable" { return "latest" }
"edge" { throw "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." }
default { return $RequestedVersion.TrimStart("v") }
}
}
function Resolve-LatestReleaseVersion {
$page = Invoke-WebRequest -Uri "https://github.com/getcompanion-ai/feynman/releases/latest"
$match = [regex]::Match($page.Content, 'releases/tag/v([0-9][^"''<>\s]*)')
if (-not $match.Success) {
throw "Failed to resolve the latest Feynman release version."
}
return $match.Groups[1].Value
}
function Resolve-VersionMetadata {
param([string]$RequestedVersion)
$normalizedVersion = Normalize-Version -RequestedVersion $RequestedVersion
if ($normalizedVersion -eq "latest") {
$resolvedVersion = Resolve-LatestReleaseVersion
} else {
$resolvedVersion = $normalizedVersion
}
return [PSCustomObject]@{
ResolvedVersion = $resolvedVersion
GitRef = "v$resolvedVersion"
DownloadUrl = if ($env:FEYNMAN_INSTALL_SKILLS_ARCHIVE_URL) { $env:FEYNMAN_INSTALL_SKILLS_ARCHIVE_URL } else { "https://github.com/getcompanion-ai/feynman/archive/refs/tags/v$resolvedVersion.zip" }
}
}
function Resolve-InstallDir {
param(
[string]$ResolvedScope,
[string]$ResolvedTargetDir
)
if ($ResolvedTargetDir) {
return $ResolvedTargetDir
}
if ($ResolvedScope -eq "Repo") {
return Join-Path (Get-Location) ".agents\skills\feynman"
}
$codexHome = if ($env:CODEX_HOME) { $env:CODEX_HOME } else { Join-Path $HOME ".codex" }
return Join-Path $codexHome "skills\feynman"
}
$metadata = Resolve-VersionMetadata -RequestedVersion $Version
$resolvedVersion = $metadata.ResolvedVersion
$downloadUrl = $metadata.DownloadUrl
$installDir = Resolve-InstallDir -ResolvedScope $Scope -ResolvedTargetDir $TargetDir
$tmpDir = Join-Path ([System.IO.Path]::GetTempPath()) ("feynman-skills-install-" + [System.Guid]::NewGuid().ToString("N"))
New-Item -ItemType Directory -Path $tmpDir | Out-Null
try {
$archivePath = Join-Path $tmpDir "feynman-skills.zip"
$extractDir = Join-Path $tmpDir "extract"
Write-Host "==> Downloading Feynman skills $resolvedVersion"
Invoke-WebRequest -Uri $downloadUrl -OutFile $archivePath
Write-Host "==> Extracting skills"
Expand-Archive -LiteralPath $archivePath -DestinationPath $extractDir -Force
$sourceRoot = Get-ChildItem -Path $extractDir -Directory | Select-Object -First 1
if (-not $sourceRoot) {
throw "Could not find extracted Feynman archive."
}
$skillsSource = Join-Path $sourceRoot.FullName "skills"
$promptsSource = Join-Path $sourceRoot.FullName "prompts"
if (-not (Test-Path $skillsSource) -or -not (Test-Path $promptsSource)) {
throw "Could not find the bundled skills resources in the downloaded archive."
}
$installParent = Split-Path $installDir -Parent
if ($installParent) {
New-Item -ItemType Directory -Path $installParent -Force | Out-Null
}
if (Test-Path $installDir) {
Remove-Item -Recurse -Force $installDir
}
New-Item -ItemType Directory -Path $installDir -Force | Out-Null
Copy-Item -Path (Join-Path $skillsSource "*") -Destination $installDir -Recurse -Force
New-Item -ItemType Directory -Path (Join-Path $installDir "prompts") -Force | Out-Null
Copy-Item -Path (Join-Path $promptsSource "*") -Destination (Join-Path $installDir "prompts") -Recurse -Force
Copy-Item -Path (Join-Path $sourceRoot.FullName "AGENTS.md") -Destination (Join-Path $installDir "AGENTS.md") -Force
Copy-Item -Path (Join-Path $sourceRoot.FullName "CONTRIBUTING.md") -Destination (Join-Path $installDir "CONTRIBUTING.md") -Force
Write-Host "==> Installed skills to $installDir"
if ($Scope -eq "Repo") {
Write-Host "Repo-local skills will be discovered automatically from .agents/skills."
} else {
Write-Host "User-level skills will be discovered from `$CODEX_HOME/skills."
}
Write-Host "Feynman skills $resolvedVersion installed successfully."
} finally {
if (Test-Path $tmpDir) {
Remove-Item -Recurse -Force $tmpDir
}
}


@@ -0,0 +1,210 @@
#!/bin/sh
set -eu
VERSION="latest"
SCOPE="${FEYNMAN_SKILLS_SCOPE:-user}"
TARGET_DIR="${FEYNMAN_SKILLS_DIR:-}"
step() {
printf '==> %s\n' "$1"
}
normalize_version() {
case "$1" in
"")
printf 'latest\n'
;;
latest | stable)
printf 'latest\n'
;;
edge)
echo "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." >&2
exit 1
;;
v*)
printf '%s\n' "${1#v}"
;;
*)
printf '%s\n' "$1"
;;
esac
}
download_file() {
url="$1"
output="$2"
if command -v curl >/dev/null 2>&1; then
if [ -t 2 ]; then
curl -fL --progress-bar "$url" -o "$output"
else
curl -fsSL "$url" -o "$output"
fi
return
fi
if command -v wget >/dev/null 2>&1; then
if [ -t 2 ]; then
wget --show-progress -O "$output" "$url"
else
wget -q -O "$output" "$url"
fi
return
fi
echo "curl or wget is required to install Feynman skills." >&2
exit 1
}
download_text() {
url="$1"
if command -v curl >/dev/null 2>&1; then
curl -fsSL "$url"
return
fi
if command -v wget >/dev/null 2>&1; then
wget -q -O - "$url"
return
fi
echo "curl or wget is required to install Feynman skills." >&2
exit 1
}
resolve_version() {
normalized_version="$(normalize_version "$VERSION")"
if [ "$normalized_version" = "latest" ]; then
release_page="$(download_text "https://github.com/getcompanion-ai/feynman/releases/latest")"
resolved_version="$(printf '%s\n' "$release_page" | sed -n 's@.*releases/tag/v\([0-9][^"<>[:space:]]*\).*@\1@p' | head -n 1)"
if [ -z "$resolved_version" ]; then
echo "Failed to resolve the latest Feynman release version." >&2
exit 1
fi
printf '%s\nv%s\n' "$resolved_version" "$resolved_version"
return
fi
printf '%s\nv%s\n' "$normalized_version" "$normalized_version"
}
resolve_target_dir() {
if [ -n "$TARGET_DIR" ]; then
printf '%s\n' "$TARGET_DIR"
return
fi
case "$SCOPE" in
repo)
printf '%s/.agents/skills/feynman\n' "$PWD"
;;
user)
codex_home="${CODEX_HOME:-$HOME/.codex}"
printf '%s/skills/feynman\n' "$codex_home"
;;
*)
echo "Unknown scope: $SCOPE (expected --user or --repo)" >&2
exit 1
;;
esac
}
while [ $# -gt 0 ]; do
case "$1" in
--repo)
SCOPE="repo"
;;
--user)
SCOPE="user"
;;
--dir)
if [ $# -lt 2 ]; then
echo "Usage: install-skills.sh [stable|latest|<version>] [--user|--repo] [--dir <path>]" >&2
exit 1
fi
TARGET_DIR="$2"
shift
;;
edge|stable|latest|v*|[0-9]*)
VERSION="$1"
;;
*)
echo "Unknown argument: $1" >&2
echo "Usage: install-skills.sh [stable|latest|<version>] [--user|--repo] [--dir <path>]" >&2
exit 1
;;
esac
shift
done
archive_metadata="$(resolve_version)"
resolved_version="$(printf '%s\n' "$archive_metadata" | sed -n '1p')"
git_ref="$(printf '%s\n' "$archive_metadata" | sed -n '2p')"
archive_url="${FEYNMAN_INSTALL_SKILLS_ARCHIVE_URL:-}"
if [ -z "$archive_url" ]; then
case "$git_ref" in
main)
archive_url="https://github.com/getcompanion-ai/feynman/archive/refs/heads/main.tar.gz"
;;
v*)
archive_url="https://github.com/getcompanion-ai/feynman/archive/refs/tags/${git_ref}.tar.gz"
;;
esac
fi
if [ -z "$archive_url" ]; then
echo "Could not resolve a download URL for ref: $git_ref" >&2
exit 1
fi
install_dir="$(resolve_target_dir)"
step "Installing Feynman skills ${resolved_version} (${SCOPE})"
tmp_dir="$(mktemp -d)"
cleanup() {
rm -rf "$tmp_dir"
}
trap cleanup EXIT INT TERM
archive_path="$tmp_dir/feynman-skills.tar.gz"
step "Downloading skills archive"
download_file "$archive_url" "$archive_path"
extract_dir="$tmp_dir/extract"
mkdir -p "$extract_dir"
step "Extracting skills"
tar -xzf "$archive_path" -C "$extract_dir"
source_root="$(find "$extract_dir" -mindepth 1 -maxdepth 1 -type d | head -n 1)"
if [ -z "$source_root" ] || [ ! -d "$source_root/skills" ] || [ ! -d "$source_root/prompts" ]; then
echo "Could not find the bundled skills resources in the downloaded archive." >&2
exit 1
fi
mkdir -p "$(dirname "$install_dir")"
rm -rf "$install_dir"
mkdir -p "$install_dir"
cp -R "$source_root/skills/." "$install_dir/"
mkdir -p "$install_dir/prompts"
cp -R "$source_root/prompts/." "$install_dir/prompts/"
cp "$source_root/AGENTS.md" "$install_dir/AGENTS.md"
cp "$source_root/CONTRIBUTING.md" "$install_dir/CONTRIBUTING.md"
step "Installed skills to $install_dir"
case "$SCOPE" in
repo)
step "Repo-local skills will be discovered automatically from .agents/skills"
;;
user)
step "User-level skills will be discovered from \$CODEX_HOME/skills"
;;
esac
printf 'Feynman skills %s installed successfully.\n' "$resolved_version"
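Both skills installers above resolve "latest" by scraping the first `releases/tag/vX.Y.Z` occurrence out of the GitHub `releases/latest` page instead of calling the rate-limited API. A sketch of that step, using a made-up HTML snippet (not a real GitHub response):

```javascript
// Stand-in for the fetched releases/latest page content.
const releasePage = `
<a href="/getcompanion-ai/feynman/releases/tag/v0.2.16">v0.2.16</a>
<a href="/getcompanion-ai/feynman/releases/tag/v0.2.15">v0.2.15</a>
`;

function resolveLatestVersion(html) {
  // Same shape as the installers' pattern: a digit, then anything that is
  // not a quote, angle bracket, or whitespace.
  const match = html.match(/releases\/tag\/v([0-9][^"'<>\s]*)/);
  if (!match) throw new Error("Failed to resolve the latest release version.");
  return match[1];
}

console.log(resolveLatestVersion(releasePage)); // → 0.2.16
```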


@@ -1,5 +1,5 @@
 param(
-[string]$Version = "edge"
+[string]$Version = "latest"
 )
 $ErrorActionPreference = "Stop"
@@ -8,17 +8,27 @@ function Normalize-Version {
 param([string]$RequestedVersion)
 if (-not $RequestedVersion) {
-return "edge"
+return "latest"
 }
 switch ($RequestedVersion.ToLowerInvariant()) {
-"edge" { return "edge" }
 "latest" { return "latest" }
 "stable" { return "latest" }
+"edge" { throw "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." }
 default { return $RequestedVersion.TrimStart("v") }
 }
 }
+function Resolve-LatestReleaseVersion {
+$page = Invoke-WebRequest -Uri "https://github.com/getcompanion-ai/feynman/releases/latest"
+$match = [regex]::Match($page.Content, 'releases/tag/v([0-9][^"''<>\s]*)')
+if (-not $match.Success) {
+throw "Failed to resolve the latest Feynman release version."
+}
+return $match.Groups[1].Value
+}
 function Resolve-ReleaseMetadata {
 param(
 [string]$RequestedVersion,
@@ -28,34 +38,8 @@ function Resolve-ReleaseMetadata {
 $normalizedVersion = Normalize-Version -RequestedVersion $RequestedVersion
-if ($normalizedVersion -eq "edge") {
-$release = Invoke-RestMethod -Uri "https://api.github.com/repos/getcompanion-ai/feynman/releases/tags/edge"
-$asset = $release.assets | Where-Object { $_.name -like "feynman-*-$AssetTarget.$BundleExtension" } | Select-Object -First 1
-if (-not $asset) {
-throw "Failed to resolve the latest Feynman edge bundle."
-}
-$archiveName = $asset.name
-$suffix = ".$BundleExtension"
-$bundleName = $archiveName.Substring(0, $archiveName.Length - $suffix.Length)
-$resolvedVersion = $bundleName.Substring("feynman-".Length)
-$resolvedVersion = $resolvedVersion.Substring(0, $resolvedVersion.Length - ("-$AssetTarget").Length)
-return [PSCustomObject]@{
-ResolvedVersion = $resolvedVersion
-BundleName = $bundleName
-ArchiveName = $archiveName
-DownloadUrl = $asset.browser_download_url
-}
-}
 if ($normalizedVersion -eq "latest") {
-$release = Invoke-RestMethod -Uri "https://api.github.com/repos/getcompanion-ai/feynman/releases/latest"
-if (-not $release.tag_name) {
-throw "Failed to resolve the latest Feynman release version."
-}
-$resolvedVersion = $release.tag_name.TrimStart("v")
+$resolvedVersion = Resolve-LatestReleaseVersion
 } else {
 $resolvedVersion = $normalizedVersion
 }
@@ -73,12 +57,26 @@ function Resolve-ReleaseMetadata {
 }
 function Get-ArchSuffix {
-$arch = [System.Runtime.InteropServices.RuntimeInformation]::OSArchitecture
-switch ($arch.ToString()) {
-"X64" { return "x64" }
-"Arm64" { return "arm64" }
-default { throw "Unsupported architecture: $arch" }
-}
+# Prefer PROCESSOR_ARCHITECTURE which is always available on Windows.
+# RuntimeInformation::OSArchitecture requires .NET 4.7.1+ and may not
+# be loaded in every Windows PowerShell 5.1 session.
+$envArch = $env:PROCESSOR_ARCHITECTURE
+if ($envArch) {
+switch ($envArch) {
+"AMD64" { return "x64" }
+"ARM64" { return "arm64" }
+}
+}
+try {
+$arch = [System.Runtime.InteropServices.RuntimeInformation]::OSArchitecture
+switch ($arch.ToString()) {
+"X64" { return "x64" }
+"Arm64" { return "arm64" }
+}
+} catch {}
+throw "Unsupported architecture: $envArch"
 }
 $archSuffix = Get-ArchSuffix
@@ -111,8 +109,8 @@ This usually means the release exists, but not all platform bundles were uploade
 Workarounds:
 - try again after the release finishes publishing
-- install via pnpm instead: pnpm add -g @companion-ai/feynman
-- install via bun instead: bun add -g @companion-ai/feynman
+- pass the latest published version explicitly, e.g.:
+& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.16
 "@
 }
@@ -127,14 +125,24 @@ Workarounds:
 New-Item -ItemType Directory -Path $installBinDir -Force | Out-Null
 $shimPath = Join-Path $installBinDir "feynman.cmd"
+$shimPs1Path = Join-Path $installBinDir "feynman.ps1"
 Write-Host "==> Linking feynman into $installBinDir"
 @"
 @echo off
-"$bundleDir\feynman.cmd" %*
+CALL "$bundleDir\feynman.cmd" %*
 "@ | Set-Content -Path $shimPath -Encoding ASCII
+@"
+`$BundleDir = "$bundleDir"
+& "`$BundleDir\node\node.exe" "`$BundleDir\app\bin\feynman.js" @args
+"@ | Set-Content -Path $shimPs1Path -Encoding UTF8
 $currentUserPath = [Environment]::GetEnvironmentVariable("Path", "User")
-if (-not $currentUserPath.Split(';').Contains($installBinDir)) {
+$alreadyOnPath = $false
+if ($currentUserPath) {
+$alreadyOnPath = $currentUserPath.Split(';') -contains $installBinDir
+}
+if (-not $alreadyOnPath) {
 $updatedPath = if ([string]::IsNullOrWhiteSpace($currentUserPath)) {
 $installBinDir
 } else {
@@ -146,6 +154,14 @@ Workarounds:
Write-Host "$installBinDir is already on PATH." Write-Host "$installBinDir is already on PATH."
} }
$resolvedCommand = Get-Command feynman -ErrorAction SilentlyContinue
if ($resolvedCommand -and $resolvedCommand.Source -ne $shimPath) {
Write-Warning "Current shell resolves feynman to $($resolvedCommand.Source)"
Write-Host "Run in a new shell, or run: `$env:Path = '$installBinDir;' + `$env:Path"
Write-Host "Then run: feynman"
Write-Host "If that path is an old package-manager install, remove it or put $installBinDir first on PATH."
}
Write-Host "Feynman $resolvedVersion installed successfully." Write-Host "Feynman $resolvedVersion installed successfully."
} finally { } finally {
if (Test-Path $tmpDir) { if (Test-Path $tmpDir) {
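The PATH hunk above guards against an empty per-user `Path` value: the old code called `$currentUserPath.Split(';')` unconditionally, which fails when the value is null. A plain-JS sketch of the guarded check and append (note the real PowerShell `-contains` comparison is case-insensitive, which this sketch does not reproduce):

```javascript
// Membership check that tolerates an empty/undefined user Path value.
function isOnPath(currentPath, binDir) {
  if (!currentPath) return false; // empty or undefined: nothing is on it
  return currentPath.split(";").includes(binDir);
}

// Append binDir only when missing; seed the value when Path was empty.
function addToPath(currentPath, binDir) {
  if (isOnPath(currentPath, binDir)) return currentPath;
  return currentPath && currentPath.trim() !== "" ? `${currentPath};${binDir}` : binDir;
}

console.log(addToPath("", "C:\\Users\\me\\.local\\bin")); // → C:\Users\me\.local\bin
```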


@@ -2,7 +2,7 @@
 set -eu
-VERSION="${1:-edge}"
+VERSION="${1:-latest}"
 INSTALL_BIN_DIR="${FEYNMAN_INSTALL_BIN_DIR:-$HOME/.local/bin}"
 INSTALL_APP_DIR="${FEYNMAN_INSTALL_APP_DIR:-$HOME/.local/share/feynman}"
 SKIP_PATH_UPDATE="${FEYNMAN_INSTALL_SKIP_PATH_UPDATE:-0}"
@@ -54,12 +54,16 @@ run_with_spinner() {
 normalize_version() {
 case "$1" in
-"" | edge)
-printf 'edge\n'
+"")
+printf 'latest\n'
 ;;
 latest | stable)
 printf 'latest\n'
 ;;
+edge)
+echo "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." >&2
+exit 1
+;;
 v*)
 printf '%s\n' "${1#v}"
 ;;
@@ -160,39 +164,29 @@ require_command() {
 fi
 }
-resolve_release_metadata() {
-normalized_version="$(normalize_version "$VERSION")"
-if [ "$normalized_version" = "edge" ]; then
-release_json="$(download_text "https://api.github.com/repos/getcompanion-ai/feynman/releases/tags/edge")"
-asset_url=""
-for candidate in $(printf '%s\n' "$release_json" | sed -n 's/.*"browser_download_url":[[:space:]]*"\([^"]*\)".*/\1/p'); do
-case "$candidate" in
-*/feynman-*-${asset_target}.${archive_extension})
-asset_url="$candidate"
-break
-;;
-esac
-done
-if [ -z "$asset_url" ]; then
-echo "Failed to resolve the latest Feynman edge bundle." >&2
-exit 1
-fi
-archive_name="${asset_url##*/}"
-bundle_name="${archive_name%.$archive_extension}"
-resolved_version="${bundle_name#feynman-}"
-resolved_version="${resolved_version%-${asset_target}}"
-printf '%s\n%s\n%s\n%s\n' "$resolved_version" "$bundle_name" "$archive_name" "$asset_url"
-return
-fi
+warn_command_conflict() {
+expected_path="$INSTALL_BIN_DIR/feynman"
+resolved_path="$(command -v feynman 2>/dev/null || true)"
+if [ -z "$resolved_path" ]; then
+return
+fi
+if [ "$resolved_path" != "$expected_path" ]; then
+step "Warning: current shell resolves feynman to $resolved_path"
+step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
+step "Or launch directly: $expected_path"
+step "If that path is an old package-manager install, remove it or put $INSTALL_BIN_DIR first on PATH."
+fi
+}
+resolve_release_metadata() {
+normalized_version="$(normalize_version "$VERSION")"
 if [ "$normalized_version" = "latest" ]; then
-release_json="$(download_text "https://api.github.com/repos/getcompanion-ai/feynman/releases/latest")"
-resolved_version="$(printf '%s\n' "$release_json" | sed -n 's/.*"tag_name":[[:space:]]*"v\([^"]*\)".*/\1/p' | head -n 1)"
+release_page="$(download_text "https://github.com/getcompanion-ai/feynman/releases/latest")"
+resolved_version="$(printf '%s\n' "$release_page" | sed -n 's@.*releases/tag/v\([0-9][^"<>[:space:]]*\).*@\1@p' | head -n 1)"
 if [ -z "$resolved_version" ]; then
 echo "Failed to resolve the latest Feynman release version." >&2
@@ -266,8 +260,8 @@ This usually means the release exists, but not all platform bundles were uploade
 Workarounds:
 - try again after the release finishes publishing
-- install via pnpm instead: pnpm add -g @companion-ai/feynman
-- install via bun instead: bun add -g @companion-ai/feynman
+- pass the latest published version explicitly, e.g.:
+curl -fsSL https://feynman.is/install | bash -s -- 0.2.16
 EOF
 exit 1
 fi
@@ -290,20 +284,22 @@ add_to_path
case "$path_action" in case "$path_action" in
added) added)
step "PATH updated for future shells in $path_profile" step "PATH updated for future shells in $path_profile"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && feynman" step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
;; ;;
configured) configured)
step "PATH is already configured for future shells in $path_profile" step "PATH is already configured for future shells in $path_profile"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && feynman" step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
;; ;;
skipped) skipped)
step "PATH update skipped" step "PATH update skipped"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && feynman" step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
;; ;;
*) *)
step "$INSTALL_BIN_DIR is already on PATH" step "$INSTALL_BIN_DIR is already on PATH"
step "Run: feynman" step "Run: hash -r && feynman"
;; ;;
esac esac
warn_command_conflict
printf 'Feynman %s installed successfully.\n' "$resolved_version" printf 'Feynman %s installed successfully.\n' "$resolved_version"


@@ -0,0 +1 @@
export function patchPiExtensionLoaderSource(source: string): string;


@@ -0,0 +1,32 @@
const PATH_TO_FILE_URL_IMPORT = 'import { fileURLToPath, pathToFileURL } from "node:url";';
const FILE_URL_TO_PATH_IMPORT = 'import { fileURLToPath } from "node:url";';
const IMPORT_CALL = 'const module = await jiti.import(extensionPath, { default: true });';
const PATCHED_IMPORT_CALL = [
' const extensionSpecifier = process.platform === "win32" && path.isAbsolute(extensionPath)',
' ? pathToFileURL(extensionPath).href',
' : extensionPath;',
' const module = await jiti.import(extensionSpecifier, { default: true });',
].join("\n");
export function patchPiExtensionLoaderSource(source) {
let patched = source;
if (patched.includes(PATH_TO_FILE_URL_IMPORT) || patched.includes(PATCHED_IMPORT_CALL)) {
return patched;
}
if (patched.includes(FILE_URL_TO_PATH_IMPORT)) {
patched = patched.replace(FILE_URL_TO_PATH_IMPORT, PATH_TO_FILE_URL_IMPORT);
}
if (!patched.includes(PATH_TO_FILE_URL_IMPORT)) {
return source;
}
if (!patched.includes(IMPORT_CALL)) {
return source;
}
return patched.replace(IMPORT_CALL, PATCHED_IMPORT_CALL);
}
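The loader patch above relies on two safety properties: it is idempotent (re-running it on already-patched source is a no-op), and it returns the source untouched when the expected upstream anchor is missing. A simplified re-statement of just the anchor-replace step (the real patch also rewrites the `node:url` import and computes `extensionSpecifier`):

```javascript
// Anchor strings mirroring the real patch targets.
const IMPORT_CALL = 'const module = await jiti.import(extensionPath, { default: true });';
const PATCHED_CALL = 'const module = await jiti.import(extensionSpecifier, { default: true });';

function patchImportCall(source) {
  if (source.includes(PATCHED_CALL)) return source; // already patched: no-op
  if (!source.includes(IMPORT_CALL)) return source; // unexpected upstream source: bail out
  return source.replace(IMPORT_CALL, PATCHED_CALL);
}

const original = `async function load(extensionPath) {\n  ${IMPORT_CALL}\n}`;
const once = patchImportCall(original);
const twice = patchImportCall(once);
console.log(once === twice); // → true
```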


@@ -0,0 +1 @@
export function patchPiGoogleLegacySchemaSource(source: string): string;


@@ -0,0 +1,44 @@
const HELPER = [
"function normalizeLegacyToolSchema(schema) {",
" if (Array.isArray(schema)) return schema.map((item) => normalizeLegacyToolSchema(item));",
' if (!schema || typeof schema !== "object") return schema;',
" const normalized = {};",
" for (const [key, value] of Object.entries(schema)) {",
' if (key === "const") {',
" normalized.enum = [value];",
" continue;",
" }",
" normalized[key] = normalizeLegacyToolSchema(value);",
" }",
" return normalized;",
"}",
].join("\n");
const ORIGINAL =
' ...(useParameters ? { parameters: tool.parameters } : { parametersJsonSchema: tool.parameters }),';
const PATCHED = [
" ...(useParameters",
" ? { parameters: normalizeLegacyToolSchema(tool.parameters) }",
" : { parametersJsonSchema: tool.parameters }),",
].join("\n");
export function patchPiGoogleLegacySchemaSource(source) {
let patched = source;
if (patched.includes("function normalizeLegacyToolSchema(schema) {")) {
return patched;
}
if (!patched.includes(ORIGINAL)) {
return source;
}
patched = patched.replace(ORIGINAL, PATCHED);
const marker = "export function convertTools(tools, useParameters = false) {";
const markerIndex = patched.indexOf(marker);
if (markerIndex === -1) {
return source;
}
return `${patched.slice(0, markerIndex)}${HELPER}\n\n${patched.slice(markerIndex)}`;
}
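The helper injected above rewrites JSON Schema `const` keywords into single-value `enum` arrays, apparently because the legacy `parameters` dialect does not accept `const`. A standalone copy of that transform, runnable on its own:

```javascript
// Recursively replace { const: X } with { enum: [X] } throughout a schema.
function normalizeLegacyToolSchema(schema) {
  if (Array.isArray(schema)) return schema.map((item) => normalizeLegacyToolSchema(item));
  if (!schema || typeof schema !== "object") return schema;
  const normalized = {};
  for (const [key, value] of Object.entries(schema)) {
    if (key === "const") {
      normalized.enum = [value];
      continue;
    }
    normalized[key] = normalizeLegacyToolSchema(value);
  }
  return normalized;
}

// Hypothetical tool schema for illustration.
const schema = {
  type: "object",
  properties: { mode: { const: "fast" }, depth: { type: "number" } },
};
console.log(JSON.stringify(normalizeLegacyToolSchema(schema).properties.mode));
// → {"enum":["fast"]}
```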


@@ -0,0 +1,2 @@
export const PI_SUBAGENTS_PATCH_TARGETS: string[];
export function patchPiSubagentsSource(relativePath: string, source: string): string;


@@ -0,0 +1,124 @@
export const PI_SUBAGENTS_PATCH_TARGETS = [
"index.ts",
"agents.ts",
"artifacts.ts",
"run-history.ts",
"skills.ts",
"chain-clarify.ts",
];
const RESOLVE_PI_AGENT_DIR_HELPER = [
"function resolvePiAgentDir(): string {",
' const configured = process.env.PI_CODING_AGENT_DIR?.trim();',
' if (!configured) return path.join(os.homedir(), ".pi", "agent");',
' return configured.startsWith("~/") ? path.join(os.homedir(), configured.slice(2)) : configured;',
"}",
].join("\n");
function injectResolvePiAgentDirHelper(source) {
if (source.includes("function resolvePiAgentDir(): string {")) {
return source;
}
const lines = source.split("\n");
let insertAt = 0;
let importSeen = false;
let importOpen = false;
for (let index = 0; index < lines.length; index += 1) {
const trimmed = lines[index].trim();
if (!importSeen) {
if (trimmed === "" || trimmed.startsWith("/**") || trimmed.startsWith("*") || trimmed.startsWith("*/")) {
insertAt = index + 1;
continue;
}
if (trimmed.startsWith("import ")) {
importSeen = true;
importOpen = !trimmed.endsWith(";");
insertAt = index + 1;
continue;
}
break;
}
if (trimmed.startsWith("import ")) {
importOpen = !trimmed.endsWith(";");
insertAt = index + 1;
continue;
}
if (importOpen) {
if (trimmed.endsWith(";")) importOpen = false;
insertAt = index + 1;
continue;
}
if (trimmed === "") {
insertAt = index + 1;
continue;
}
insertAt = index;
break;
}
return [...lines.slice(0, insertAt), "", RESOLVE_PI_AGENT_DIR_HELPER, "", ...lines.slice(insertAt)].join("\n");
}
function replaceAll(source, from, to) {
return source.split(from).join(to);
}
export function patchPiSubagentsSource(relativePath, source) {
let patched = source;
switch (relativePath) {
case "index.ts":
patched = replaceAll(
patched,
'const configPath = path.join(os.homedir(), ".pi", "agent", "extensions", "subagent", "config.json");',
'const configPath = path.join(resolvePiAgentDir(), "extensions", "subagent", "config.json");',
);
break;
case "agents.ts":
patched = replaceAll(
patched,
'const userDir = path.join(os.homedir(), ".pi", "agent", "agents");',
'const userDir = path.join(resolvePiAgentDir(), "agents");',
);
break;
case "artifacts.ts":
patched = replaceAll(
patched,
'const sessionsBase = path.join(os.homedir(), ".pi", "agent", "sessions");',
'const sessionsBase = path.join(resolvePiAgentDir(), "sessions");',
);
break;
case "run-history.ts":
patched = replaceAll(
patched,
'const HISTORY_PATH = path.join(os.homedir(), ".pi", "agent", "run-history.jsonl");',
'const HISTORY_PATH = path.join(resolvePiAgentDir(), "run-history.jsonl");',
);
break;
case "skills.ts":
patched = replaceAll(
patched,
'const AGENT_DIR = path.join(os.homedir(), ".pi", "agent");',
"const AGENT_DIR = resolvePiAgentDir();",
);
break;
case "chain-clarify.ts":
patched = replaceAll(
patched,
'const dir = path.join(os.homedir(), ".pi", "agent", "agents");',
'const dir = path.join(resolvePiAgentDir(), "agents");',
);
break;
default:
return source;
}
if (patched === source) {
return source;
}
return injectResolvePiAgentDirHelper(patched);
}
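Every case above routes a hard-coded `~/.pi/agent` path through the injected `resolvePiAgentDir` helper. A standalone sketch of that helper; the real one reads `process.env` directly, so taking `env` as a parameter here is an adaptation for testability:

```javascript
import { homedir } from "node:os";
import { join } from "node:path";

// PI_CODING_AGENT_DIR overrides the default ~/.pi/agent location; a leading
// "~/" is expanded against the home directory, other values pass through.
function resolvePiAgentDir(env = process.env) {
  const configured = env.PI_CODING_AGENT_DIR?.trim();
  if (!configured) return join(homedir(), ".pi", "agent"); // default location
  return configured.startsWith("~/") ? join(homedir(), configured.slice(2)) : configured;
}

console.log(resolvePiAgentDir({ PI_CODING_AGENT_DIR: "~/custom/agent" }));
```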


@@ -0,0 +1,2 @@
export const PI_WEB_ACCESS_PATCH_TARGETS: string[];
export function patchPiWebAccessSource(relativePath: string, source: string): string;


@@ -0,0 +1,32 @@
export const PI_WEB_ACCESS_PATCH_TARGETS = [
"index.ts",
"exa.ts",
"gemini-api.ts",
"gemini-search.ts",
"gemini-web.ts",
"github-extract.ts",
"perplexity.ts",
"video-extract.ts",
"youtube-extract.ts",
];
const LEGACY_CONFIG_EXPR = 'join(homedir(), ".pi", "web-search.json")';
const PATCHED_CONFIG_EXPR =
'process.env.FEYNMAN_WEB_SEARCH_CONFIG ?? process.env.PI_WEB_SEARCH_CONFIG ?? join(homedir(), ".pi", "web-search.json")';
export function patchPiWebAccessSource(relativePath, source) {
let patched = source;
if (patched.includes(PATCHED_CONFIG_EXPR)) {
return patched;
}
patched = patched.split(LEGACY_CONFIG_EXPR).join(PATCHED_CONFIG_EXPR);
if (relativePath === "index.ts" && patched !== source) {
patched = patched.replace('import { join } from "node:path";', 'import { dirname, join } from "node:path";');
patched = patched.replace('const dir = join(homedir(), ".pi");', "const dir = dirname(WEB_SEARCH_CONFIG_PATH);");
}
return patched;
}


@@ -1,12 +1,22 @@
 import { spawnSync } from "node:child_process";
 import { existsSync, mkdirSync, readFileSync, rmSync, writeFileSync } from "node:fs";
 import { createRequire } from "node:module";
+import { homedir } from "node:os";
 import { dirname, resolve } from "node:path";
 import { fileURLToPath } from "node:url";
 import { FEYNMAN_LOGO_HTML } from "../logo.mjs";
+import { patchPiExtensionLoaderSource } from "./lib/pi-extension-loader-patch.mjs";
+import { patchPiGoogleLegacySchemaSource } from "./lib/pi-google-legacy-schema-patch.mjs";
+import { PI_WEB_ACCESS_PATCH_TARGETS, patchPiWebAccessSource } from "./lib/pi-web-access-patch.mjs";
+import { PI_SUBAGENTS_PATCH_TARGETS, patchPiSubagentsSource } from "./lib/pi-subagents-patch.mjs";
 const here = dirname(fileURLToPath(import.meta.url));
 const appRoot = resolve(here, "..");
+const feynmanHome = resolve(process.env.FEYNMAN_HOME ?? homedir(), ".feynman");
+const feynmanNpmPrefix = resolve(feynmanHome, "npm-global");
+process.env.FEYNMAN_NPM_PREFIX = feynmanNpmPrefix;
+process.env.NPM_CONFIG_PREFIX = feynmanNpmPrefix;
+process.env.npm_config_prefix = feynmanNpmPrefix;
 const appRequire = createRequire(resolve(appRoot, "package.json"));
 const isGlobalInstall = process.env.npm_config_global === "true" || process.env.npm_config_location === "global";
@@ -51,8 +61,20 @@ const cliPath = piPackageRoot ? resolve(piPackageRoot, "dist", "cli.js") : null;
 const bunCliPath = piPackageRoot ? resolve(piPackageRoot, "dist", "bun", "cli.js") : null;
 const interactiveModePath = piPackageRoot ? resolve(piPackageRoot, "dist", "modes", "interactive", "interactive-mode.js") : null;
 const interactiveThemePath = piPackageRoot ? resolve(piPackageRoot, "dist", "modes", "interactive", "theme", "theme.js") : null;
+const extensionLoaderPath = piPackageRoot ? resolve(piPackageRoot, "dist", "core", "extensions", "loader.js") : null;
+const terminalPath = piTuiRoot ? resolve(piTuiRoot, "dist", "terminal.js") : null;
 const editorPath = piTuiRoot ? resolve(piTuiRoot, "dist", "components", "editor.js") : null;
 const workspaceRoot = resolve(appRoot, ".feynman", "npm", "node_modules");
+const workspaceExtensionLoaderPath = resolve(
+  workspaceRoot,
+  "@mariozechner",
+  "pi-coding-agent",
+  "dist",
+  "core",
+  "extensions",
+  "loader.js",
+);
+const piSubagentsRoot = resolve(workspaceRoot, "pi-subagents");
 const webAccessPath = resolve(workspaceRoot, "pi-web-access", "index.ts");
 const sessionSearchIndexerPath = resolve(
   workspaceRoot,
@@ -70,7 +92,17 @@ const workspaceArchivePath = resolve(appRoot, ".feynman", "runtime-workspace.tgz
 function createInstallCommand(packageManager, packageSpecs) {
   switch (packageManager) {
     case "npm":
-      return ["install", "--prefer-offline", "--no-audit", "--no-fund", "--loglevel", "error", ...packageSpecs];
+      return [
+        "install",
+        "--global=false",
+        "--location=project",
+        "--prefer-offline",
+        "--no-audit",
+        "--no-fund",
+        "--loglevel",
+        "error",
+        ...packageSpecs,
+      ];
     case "pnpm":
       return ["add", "--prefer-offline", "--reporter", "silent", ...packageSpecs];
     case "bun":
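The npm branch above gains two scoping flags. A standalone sketch of just that branch, so the flag ordering is easy to verify (the helper name and the sample package spec are hypothetical, not part of the repo):

```javascript
// Sketch of the npm case in createInstallCommand: the two new flags pin
// the install to the project even when the surrounding environment
// configures a global prefix (as the installer itself now does).
function createNpmInstallArgs(packageSpecs) {
  return [
    "install",
    "--global=false",     // never fall through to a global install
    "--location=project", // install into the workspace, not the npm prefix
    "--prefer-offline",
    "--no-audit",
    "--no-fund",
    "--loglevel",
    "error",
    ...packageSpecs,
  ];
}
```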
@@ -138,12 +170,18 @@ function restorePackagedWorkspace(packageSpecs) {
     timeout: 300000,
   });
+  // On Windows, tar may exit non-zero due to symlink creation failures in
+  // .bin/ directories. These are non-fatal — check whether the actual
+  // package directories were extracted successfully.
+  const packagesPresent = packageSpecs.every((spec) => existsSync(resolve(workspaceRoot, parsePackageName(spec))));
+  if (packagesPresent) return true;
   if (result.status !== 0) {
     if (result.stderr?.length) process.stderr.write(result.stderr);
     return false;
   }
-  return packageSpecs.every((spec) => existsSync(resolve(workspaceRoot, parsePackageName(spec))));
+  return false;
 }
 function refreshPackagedWorkspace(packageSpecs) {
@@ -155,12 +193,18 @@ function resolveExecutable(name, fallbackPaths = []) {
     if (existsSync(candidate)) return candidate;
   }
-  const result = spawnSync("sh", ["-lc", `command -v ${name}`], {
-    encoding: "utf8",
-    stdio: ["ignore", "pipe", "ignore"],
-  });
+  const isWindows = process.platform === "win32";
+  const result = isWindows
+    ? spawnSync("cmd", ["/c", `where ${name}`], {
+        encoding: "utf8",
+        stdio: ["ignore", "pipe", "ignore"],
+      })
+    : spawnSync("sh", ["-lc", `command -v ${name}`], {
+        encoding: "utf8",
+        stdio: ["ignore", "pipe", "ignore"],
+      });
   if (result.status === 0) {
-    const resolved = result.stdout.trim();
+    const resolved = result.stdout.trim().split(/\r?\n/)[0];
     if (resolved) return resolved;
   }
   return null;
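The cross-platform lookup in this hunk can be distilled into a runnable sketch: `where` on Windows (which may print several matches, hence the first-line split), `command -v` elsewhere. The standalone function name is hypothetical:

```javascript
import { spawnSync } from "node:child_process";

// Resolve an executable on PATH: `where` on Windows, `command -v` elsewhere.
function resolveExecutable(name) {
  const isWindows = process.platform === "win32";
  const result = isWindows
    ? spawnSync("cmd", ["/c", `where ${name}`], { encoding: "utf8", stdio: ["ignore", "pipe", "ignore"] })
    : spawnSync("sh", ["-lc", `command -v ${name}`], { encoding: "utf8", stdio: ["ignore", "pipe", "ignore"] });
  if (result.status === 0) {
    // `where` can report multiple hits; keep only the first line.
    const resolved = result.stdout.trim().split(/\r?\n/)[0];
    if (resolved) return resolved;
  }
  return null;
}
```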
@@ -230,6 +274,19 @@ function ensurePandoc() {
 ensurePandoc();
+if (existsSync(piSubagentsRoot)) {
+  for (const relativePath of PI_SUBAGENTS_PATCH_TARGETS) {
+    const entryPath = resolve(piSubagentsRoot, relativePath);
+    if (!existsSync(entryPath)) continue;
+    const source = readFileSync(entryPath, "utf8");
+    const patched = patchPiSubagentsSource(relativePath, source);
+    if (patched !== source) {
+      writeFileSync(entryPath, patched, "utf8");
+    }
+  }
+}
 if (packageJsonPath && existsSync(packageJsonPath)) {
   const pkg = JSON.parse(readFileSync(packageJsonPath, "utf8"));
   if (pkg.piConfig?.name !== "feynman" || pkg.piConfig?.configDir !== ".feynman") {
@@ -247,10 +304,68 @@ for (const entryPath of [cliPath, bunCliPath].filter(Boolean)) {
     continue;
   }
-  const cliSource = readFileSync(entryPath, "utf8");
+  let cliSource = readFileSync(entryPath, "utf8");
   if (cliSource.includes('process.title = "pi";')) {
-    writeFileSync(entryPath, cliSource.replace('process.title = "pi";', 'process.title = "feynman";'), "utf8");
+    cliSource = cliSource.replace('process.title = "pi";', 'process.title = "feynman";');
   }
+  const stdinErrorGuard = [
+    "const feynmanHandleStdinError = (error) => {",
+    '  if (error && typeof error === "object") {',
+    '    const code = "code" in error ? error.code : undefined;',
+    '    const syscall = "syscall" in error ? error.syscall : undefined;',
+    '    if ((code === "EIO" || code === "EBADF") && syscall === "read") {',
+    "      return;",
+    "    }",
+    "  }",
+    "};",
+    'process.stdin?.on?.("error", feynmanHandleStdinError);',
+  ].join("\n");
+  if (!cliSource.includes('process.stdin?.on?.("error", feynmanHandleStdinError);')) {
+    cliSource = cliSource.replace(
+      'process.emitWarning = (() => { });',
+      `process.emitWarning = (() => { });\n${stdinErrorGuard}`,
+    );
+  }
+  writeFileSync(entryPath, cliSource, "utf8");
+}
+if (terminalPath && existsSync(terminalPath)) {
+  let terminalSource = readFileSync(terminalPath, "utf8");
+  if (!terminalSource.includes("stdinErrorHandler;")) {
+    terminalSource = terminalSource.replace(
+      "  stdinBuffer;\n  stdinDataHandler;\n",
+      [
+        "  stdinBuffer;",
+        "  stdinDataHandler;",
+        "  stdinErrorHandler = (error) => {",
+        '    if ((error?.code === "EIO" || error?.code === "EBADF") && error?.syscall === "read") {',
+        "      return;",
+        "    }",
+        "  };",
+      ].join("\n") + "\n",
+    );
+  }
+  if (!terminalSource.includes('process.stdin.on("error", this.stdinErrorHandler);')) {
+    terminalSource = terminalSource.replace(
+      '    process.stdin.resume();\n',
+      '    process.stdin.resume();\n    process.stdin.on("error", this.stdinErrorHandler);\n',
+    );
+  }
+  if (!terminalSource.includes('    process.stdin.removeListener("error", this.stdinErrorHandler);')) {
+    terminalSource = terminalSource.replace(
+      '    process.stdin.removeListener("data", onData);\n    this.inputHandler = previousHandler;\n',
+      [
+        '    process.stdin.removeListener("data", onData);',
+        '    process.stdin.removeListener("error", this.stdinErrorHandler);',
+        '    this.inputHandler = previousHandler;',
+      ].join("\n"),
+    );
+    terminalSource = terminalSource.replace(
+      '    process.stdin.pause();\n',
+      '    process.stdin.removeListener("error", this.stdinErrorHandler);\n    process.stdin.pause();\n',
+    );
+  }
+  writeFileSync(terminalPath, terminalSource, "utf8");
 }
 if (interactiveModePath && existsSync(interactiveModePath)) {
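The guard strings spliced into the CLI and terminal sources above share one predicate. A standalone sketch of it (the function name is hypothetical; the codes come from the injected guard):

```javascript
// EIO/EBADF errors raised by a read syscall are terminal-teardown noise,
// e.g. the controlling TTY vanished, and are safe to swallow; anything
// else should still reach the default error handling.
function isIgnorableStdinError(error) {
  if (!error || typeof error !== "object") return false;
  const code = "code" in error ? error.code : undefined;
  const syscall = "syscall" in error ? error.syscall : undefined;
  return (code === "EIO" || code === "EBADF") && syscall === "read";
}
```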
@@ -266,6 +381,18 @@ if (interactiveModePath && existsSync(interactiveModePath)) {
   }
 }
+for (const loaderPath of [extensionLoaderPath, workspaceExtensionLoaderPath].filter(Boolean)) {
+  if (!existsSync(loaderPath)) {
+    continue;
+  }
+  const source = readFileSync(loaderPath, "utf8");
+  const patched = patchPiExtensionLoaderSource(source);
+  if (patched !== source) {
+    writeFileSync(loaderPath, patched, "utf8");
+  }
+}
 if (interactiveThemePath && existsSync(interactiveThemePath)) {
   let themeSource = readFileSync(interactiveThemePath, "utf8");
   const desiredGetEditorTheme = [
@@ -441,6 +568,21 @@ if (existsSync(webAccessPath)) {
   }
 }
+const piWebAccessRoot = resolve(workspaceRoot, "pi-web-access");
+if (existsSync(piWebAccessRoot)) {
+  for (const relativePath of PI_WEB_ACCESS_PATCH_TARGETS) {
+    const entryPath = resolve(piWebAccessRoot, relativePath);
+    if (!existsSync(entryPath)) continue;
+    const source = readFileSync(entryPath, "utf8");
+    const patched = patchPiWebAccessSource(relativePath, source);
+    if (patched !== source) {
+      writeFileSync(entryPath, patched, "utf8");
+    }
+  }
+}
 if (existsSync(sessionSearchIndexerPath)) {
   const source = readFileSync(sessionSearchIndexerPath, "utf8");
   const original = 'const sessionsDir = path.join(os.homedir(), ".pi", "agent", "sessions");';
@@ -452,6 +594,7 @@ if (existsSync(sessionSearchIndexerPath)) {
 }
 const oauthPagePath = piAiRoot ? resolve(piAiRoot, "dist", "utils", "oauth", "oauth-page.js") : null;
+const googleSharedPath = piAiRoot ? resolve(piAiRoot, "dist", "providers", "google-shared.js") : null;
 if (oauthPagePath && existsSync(oauthPagePath)) {
   let source = readFileSync(oauthPagePath, "utf8");
@@ -464,6 +607,14 @@ if (oauthPagePath && existsSync(oauthPagePath)) {
   if (changed) writeFileSync(oauthPagePath, source, "utf8");
 }
+if (googleSharedPath && existsSync(googleSharedPath)) {
+  const source = readFileSync(googleSharedPath, "utf8");
+  const patched = patchPiGoogleLegacySchemaSource(source);
+  if (patched !== source) {
+    writeFileSync(googleSharedPath, patched, "utf8");
+  }
+}
 const alphaHubAuthPath = findPackageRoot("@companion-ai/alpha-hub")
   ? resolve(findPackageRoot("@companion-ai/alpha-hub"), "src", "lib", "auth.js")
   : null;
@@ -482,6 +633,11 @@ if (alphaHubAuthPath && existsSync(alphaHubAuthPath)) {
   if (source.includes(oldError)) {
     source = source.replace(oldError, newError);
   }
+  const brokenWinOpen = "else if (plat === 'win32') execSync(`start \"${url}\"`);";
+  const fixedWinOpen = "else if (plat === 'win32') execSync(`cmd /c start \"\" \"${url}\"`);";
+  if (source.includes(brokenWinOpen)) {
+    source = source.replace(brokenWinOpen, fixedWinOpen);
+  }
   writeFileSync(alphaHubAuthPath, source, "utf8");
 }
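The win32 rewrite above exists because `start` treats its first quoted argument as a window title: `start "https://example.com"` opens nothing, while an empty title first makes the URL the actual command. A minimal sketch (helper name hypothetical):

```javascript
// Build a Windows shell command that opens a URL in the default browser.
// The empty "" is the window title; without it, a quoted URL is consumed
// as the title and no browser launches.
function windowsOpenCommand(url) {
  return `cmd /c start "" "${url}"`;
}
```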

View File

@@ -7,5 +7,7 @@ const websitePublicDir = resolve(appRoot, "website", "public");
 mkdirSync(websitePublicDir, { recursive: true });
 cpSync(resolve(appRoot, "scripts", "install", "install.sh"), resolve(websitePublicDir, "install"));
 cpSync(resolve(appRoot, "scripts", "install", "install.ps1"), resolve(websitePublicDir, "install.ps1"));
+cpSync(resolve(appRoot, "scripts", "install", "install-skills.sh"), resolve(websitePublicDir, "install-skills"));
+cpSync(resolve(appRoot, "scripts", "install", "install-skills.ps1"), resolve(websitePublicDir, "install-skills.ps1"));
 console.log("[feynman] synced website installers");

View File

@@ -11,7 +11,7 @@ Use the `alpha` CLI via bash for all paper research operations.
 | Command | Description |
 |---------|-------------|
-| `alpha search "<query>"` | Search papers. Modes: `--mode semantic`, `--mode keyword`, `--mode agentic` |
+| `alpha search "<query>"` | Search papers. Prefer `--mode semantic` by default; use `--mode keyword` only for exact-term lookup and `--mode agentic` for broader retrieval. |
 | `alpha get <arxiv-id-or-url>` | Fetch paper content and any local annotation |
 | `alpha get --full-text <arxiv-id>` | Get raw full text instead of AI report |
 | `alpha ask <arxiv-id> "<question>"` | Ask a question about a paper's PDF |
@@ -22,7 +22,7 @@ Use the `alpha` CLI via bash for all paper research operations.
 ## Auth
-Run `alpha login` to authenticate with alphaXiv. Check status with `alpha status`.
+Run `alpha login` to authenticate with alphaXiv. Check status with `feynman alpha status`, or `alpha status` once your installed `alpha-hub` version includes it.
 ## Examples

View File

@@ -5,7 +5,7 @@ description: Autonomous experiment loop that tries ideas, measures results, keep
 # Autoresearch
-Run the `/autoresearch` workflow. Read the prompt template at `prompts/autoresearch.md` for the full procedure.
+Run the `/autoresearch` workflow. Read the prompt template at `../prompts/autoresearch.md` for the full procedure.
 Tools used: `init_experiment`, `run_experiment`, `log_experiment` (from pi-autoresearch)

View File

@@ -0,0 +1,28 @@
---
name: contributing
description: Contribute changes to the Feynman repository itself. Use when the task is to add features, fix bugs, update prompts or skills, change install or release behavior, improve docs, or prepare a focused PR against this repo.
---
# Contributing
Read `../CONTRIBUTING.md` first, then `../AGENTS.md` for repo-level agent conventions.
Use this skill when working on Feynman itself, especially for:
- CLI or runtime changes in `src/`
- prompt changes in `prompts/`
- bundled skill changes in `skills/`
- subagent behavior changes in `.feynman/agents/`
- install, packaging, or release changes in `scripts/`, `README.md`, or website docs
Minimum local checks before claiming the repo change is done:
```bash
npm test
npm run typecheck
npm run build
```
If the docs site changed, also validate `website/`.
When changing release-sensitive behavior, verify that `.nvmrc`, package `engines`, runtime guards, and install docs stay aligned.

View File

@@ -5,7 +5,7 @@ description: Run a thorough, source-heavy investigation on any topic. Use when t
 # Deep Research
-Run the `/deepresearch` workflow. Read the prompt template at `prompts/deepresearch.md` for the full procedure.
+Run the `/deepresearch` workflow. Read the prompt template at `../prompts/deepresearch.md` for the full procedure.
 Agents used: `researcher`, `verifier`, `reviewer`

skills/eli5/SKILL.md Normal file
View File

@@ -0,0 +1,25 @@
---
name: eli5
description: Explain research, papers, or technical ideas in plain English with minimal jargon, concrete analogies, and clear takeaways. Use when the user says "ELI5 this", asks for a simple explanation of a paper or research result, wants jargon removed, or asks what something technically dense actually means.
---
# ELI5
Use `alpha` first when the user names a specific paper, arXiv id, DOI, or paper URL.
If the user gives only a topic, identify 1-3 representative papers and anchor the explanation around the clearest or most important one.
Structure the answer with:
- `One-Sentence Summary`
- `Big Idea`
- `How It Works`
- `Why It Matters`
- `What To Be Skeptical Of`
- `If You Remember 3 Things`
Guidelines:
- Use short sentences and concrete words.
- Define jargon immediately or remove it.
- Prefer one good analogy over several weak ones.
- Separate what the paper actually shows from speculation or interpretation.
- Keep the explanation inline unless the user explicitly asks to save it as an artifact.

View File

@@ -5,6 +5,6 @@ description: Inspect active background research work including running processes
 # Jobs
-Run the `/jobs` workflow. Read the prompt template at `prompts/jobs.md` for the full procedure.
+Run the `/jobs` workflow. Read the prompt template at `../prompts/jobs.md` for the full procedure.
 Shows active `pi-processes`, scheduled `pi-schedule-prompt` entries, and running subagent tasks.

View File

@@ -5,7 +5,7 @@ description: Run a literature review using paper search and primary-source synth
 # Literature Review
-Run the `/lit` workflow. Read the prompt template at `prompts/lit.md` for the full procedure.
+Run the `/lit` workflow. Read the prompt template at `../prompts/lit.md` for the full procedure.
 Agents used: `researcher`, `verifier`, `reviewer`

View File

@@ -5,7 +5,7 @@ description: Compare a paper's claims against its public codebase. Use when the
 # Paper-Code Audit
-Run the `/audit` workflow. Read the prompt template at `prompts/audit.md` for the full procedure.
+Run the `/audit` workflow. Read the prompt template at `../prompts/audit.md` for the full procedure.
 Agents used: `researcher`, `verifier`

View File

@@ -5,7 +5,7 @@ description: Turn research findings into a polished paper-style draft with secti
 # Paper Writing
-Run the `/draft` workflow. Read the prompt template at `prompts/draft.md` for the full procedure.
+Run the `/draft` workflow. Read the prompt template at `../prompts/draft.md` for the full procedure.
 Agents used: `writer`, `verifier`

View File

@@ -5,7 +5,7 @@ description: Simulate a tough but constructive peer review of an AI research art
 # Peer Review
-Run the `/review` workflow. Read the prompt template at `prompts/review.md` for the full procedure.
+Run the `/review` workflow. Read the prompt template at `../prompts/review.md` for the full procedure.
 Agents used: `researcher`, `reviewer`

View File

@@ -5,7 +5,7 @@ description: Plan or execute a replication of a paper, claim, or benchmark. Use
 # Replication
-Run the `/replicate` workflow. Read the prompt template at `prompts/replicate.md` for the full procedure.
+Run the `/replicate` workflow. Read the prompt template at `../prompts/replicate.md` for the full procedure.
 Agents used: `researcher`

View File

@@ -5,6 +5,6 @@ description: Write a durable session log capturing completed work, findings, ope
 # Session Log
-Run the `/log` workflow. Read the prompt template at `prompts/log.md` for the full procedure.
+Run the `/log` workflow. Read the prompt template at `../prompts/log.md` for the full procedure.
 Output: session log in `notes/session-logs/`.

View File

@@ -5,7 +5,7 @@ description: Compare multiple sources on a topic and produce a grounded comparis
 # Source Comparison
-Run the `/compare` workflow. Read the prompt template at `prompts/compare.md` for the full procedure.
+Run the `/compare` workflow. Read the prompt template at `../prompts/compare.md` for the full procedure.
 Agents used: `researcher`, `verifier`

View File

@@ -5,7 +5,7 @@ description: Set up a recurring research watch on a topic, company, paper area,
 # Watch
-Run the `/watch` workflow. Read the prompt template at `prompts/watch.md` for the full procedure.
+Run the `/watch` workflow. Read the prompt template at `../prompts/watch.md` for the full procedure.
 Agents used: `researcher`

View File

@@ -1,5 +1,5 @@
 import { createHash } from "node:crypto";
-import { existsSync, mkdirSync, readdirSync, readFileSync, writeFileSync } from "node:fs";
+import { existsSync, mkdirSync, readdirSync, readFileSync, rmSync, writeFileSync } from "node:fs";
 import { dirname, relative, resolve } from "node:path";
 import { getBootstrapStatePath } from "../config/paths.js";
@@ -64,27 +64,76 @@ function listFiles(root: string): string[] {
   return files.sort();
 }
+function removeEmptyParentDirectories(path: string, stopAt: string): void {
+  let current = dirname(path);
+  while (current.startsWith(stopAt) && current !== stopAt) {
+    if (!existsSync(current)) {
+      current = dirname(current);
+      continue;
+    }
+    if (readdirSync(current).length > 0) {
+      return;
+    }
+    rmSync(current, { recursive: true, force: true });
+    current = dirname(current);
+  }
+}
 function syncManagedFiles(
   sourceRoot: string,
   targetRoot: string,
+  scope: string,
   state: BootstrapState,
   result: BootstrapSyncResult,
 ): void {
+  const sourcePaths = new Set(listFiles(sourceRoot).map((sourcePath) => relative(sourceRoot, sourcePath)));
+  for (const targetPath of listFiles(targetRoot)) {
+    const key = relative(targetRoot, targetPath);
+    if (sourcePaths.has(key)) continue;
+    const scopedKey = `${scope}:${key}`;
+    const previous = state.files[scopedKey] ?? state.files[key];
+    if (!previous) {
+      continue;
+    }
+    if (!existsSync(targetPath)) {
+      delete state.files[scopedKey];
+      delete state.files[key];
+      continue;
+    }
+    const currentTargetText = readFileSync(targetPath, "utf8");
+    const currentTargetHash = sha256(currentTargetText);
+    if (currentTargetHash !== previous.lastAppliedTargetHash) {
+      result.skipped.push(key);
+      continue;
+    }
+    rmSync(targetPath, { force: true });
+    removeEmptyParentDirectories(targetPath, targetRoot);
+    delete state.files[scopedKey];
+    delete state.files[key];
+  }
   for (const sourcePath of listFiles(sourceRoot)) {
     const key = relative(sourceRoot, sourcePath);
     const targetPath = resolve(targetRoot, key);
     const sourceText = readFileSync(sourcePath, "utf8");
     const sourceHash = sha256(sourceText);
-    const previous = state.files[key];
+    const scopedKey = `${scope}:${key}`;
+    const previous = state.files[scopedKey] ?? state.files[key];
     mkdirSync(dirname(targetPath), { recursive: true });
     if (!existsSync(targetPath)) {
       writeFileSync(targetPath, sourceText, "utf8");
-      state.files[key] = {
+      state.files[scopedKey] = {
         lastAppliedSourceHash: sourceHash,
         lastAppliedTargetHash: sourceHash,
       };
+      delete state.files[key];
       result.copied.push(key);
       continue;
     }
@@ -93,10 +142,11 @@ function syncManagedFiles(
     const currentTargetHash = sha256(currentTargetText);
     if (currentTargetHash === sourceHash) {
-      state.files[key] = {
+      state.files[scopedKey] = {
         lastAppliedSourceHash: sourceHash,
         lastAppliedTargetHash: currentTargetHash,
       };
+      delete state.files[key];
       continue;
     }
@@ -111,10 +161,11 @@ function syncManagedFiles(
     }
     writeFileSync(targetPath, sourceText, "utf8");
-    state.files[key] = {
+    state.files[scopedKey] = {
       lastAppliedSourceHash: sourceHash,
       lastAppliedTargetHash: sourceHash,
     };
+    delete state.files[key];
     result.updated.push(key);
   }
 }
@@ -128,9 +179,9 @@ export function syncBundledAssets(appRoot: string, agentDir: string): BootstrapS
     skipped: [],
   };
-  syncManagedFiles(resolve(appRoot, ".feynman", "themes"), resolve(agentDir, "themes"), state, result);
-  syncManagedFiles(resolve(appRoot, ".feynman", "agents"), resolve(agentDir, "agents"), state, result);
-  syncManagedFiles(resolve(appRoot, "skills"), resolve(agentDir, "skills"), state, result);
+  syncManagedFiles(resolve(appRoot, ".feynman", "themes"), resolve(agentDir, "themes"), "themes", state, result);
+  syncManagedFiles(resolve(appRoot, ".feynman", "agents"), resolve(agentDir, "agents"), "agents", state, result);
+  syncManagedFiles(resolve(appRoot, "skills"), resolve(agentDir, "skills"), "skills", state, result);
   writeBootstrapState(statePath, state);
   return result;

View File

@@ -11,25 +11,30 @@ import {
   login as loginAlpha,
   logout as logoutAlpha,
 } from "@companion-ai/alpha-hub/lib";
-import { AuthStorage, DefaultPackageManager, ModelRegistry, SettingsManager } from "@mariozechner/pi-coding-agent";
+import { DefaultPackageManager, SettingsManager } from "@mariozechner/pi-coding-agent";
 import { syncBundledAssets } from "./bootstrap/sync.js";
 import { ensureFeynmanHome, getDefaultSessionDir, getFeynmanAgentDir, getFeynmanHome } from "./config/paths.js";
 import { launchPiChat } from "./pi/launch.js";
 import { CORE_PACKAGE_SOURCES, getOptionalPackagePresetSources, listOptionalPackagePresets } from "./pi/package-presets.js";
 import { normalizeFeynmanSettings, normalizeThinkingLevel, parseModelSpec } from "./pi/settings.js";
+import { applyFeynmanPackageManagerEnv } from "./pi/runtime.js";
+import { getConfiguredServiceTier, normalizeServiceTier, setConfiguredServiceTier } from "./model/service-tier.js";
 import {
+  authenticateModelProvider,
   getCurrentModelSpec,
   loginModelProvider,
   logoutModelProvider,
   printModelList,
   setDefaultModelSpec,
 } from "./model/commands.js";
-import { printSearchStatus } from "./search/commands.js";
+import { clearSearchConfig, printSearchStatus, setSearchProvider } from "./search/commands.js";
+import type { PiWebSearchProvider } from "./pi/web-access.js";
 import { runDoctor, runStatus } from "./setup/doctor.js";
 import { setupPreviewDependencies } from "./setup/preview.js";
 import { runSetup } from "./setup/setup.js";
 import { ASH, printAsciiHeader, printInfo, printPanel, printSection, RESET, SAGE } from "./ui/terminal.js";
+import { createModelRegistry } from "./model/registry.js";
 import {
   cliCommandSections,
   formatCliWorkflowUsage,
@@ -124,7 +129,13 @@ async function handleModelCommand(subcommand: string | undefined, args: string[]
   }
   if (subcommand === "login") {
-    await loginModelProvider(feynmanAuthPath, args[0], feynmanSettingsPath);
+    if (args[0]) {
+      // Specific provider given - use OAuth login directly
+      await loginModelProvider(feynmanAuthPath, args[0], feynmanSettingsPath);
+    } else {
+      // No provider specified - show auth method choice
+      await authenticateModelProvider(feynmanAuthPath, feynmanSettingsPath);
+    }
     return;
   }
@@ -142,10 +153,34 @@ async function handleModelCommand(subcommand: string | undefined, args: string[]
     return;
   }
+  if (subcommand === "tier") {
+    const requested = args[0];
+    if (!requested) {
+      console.log(getConfiguredServiceTier(feynmanSettingsPath) ?? "not set");
+      return;
+    }
+    if (requested === "unset" || requested === "clear" || requested === "off") {
+      setConfiguredServiceTier(feynmanSettingsPath, undefined);
+      console.log("Cleared service tier override");
+      return;
+    }
+    const tier = normalizeServiceTier(requested);
+    if (!tier) {
+      throw new Error("Usage: feynman model tier <auto|default|flex|priority|standard_only|unset>");
+    }
+    setConfiguredServiceTier(feynmanSettingsPath, tier);
+    console.log(`Service tier set to ${tier}`);
+    return;
+  }
   throw new Error(`Unknown model command: ${subcommand}`);
 }
 async function handleUpdateCommand(workingDir: string, feynmanAgentDir: string, source?: string): Promise<void> {
+  applyFeynmanPackageManagerEnv(feynmanAgentDir);
   const settingsManager = SettingsManager.create(workingDir, feynmanAgentDir);
   const packageManager = new DefaultPackageManager({
     cwd: workingDir,
@@ -169,6 +204,7 @@ async function handleUpdateCommand(workingDir: string, feynmanAgentDir: string,
 }
 async function handlePackagesCommand(subcommand: string | undefined, args: string[], workingDir: string, feynmanAgentDir: string): Promise<void> {
+  applyFeynmanPackageManagerEnv(feynmanAgentDir);
   const settingsManager = SettingsManager.create(workingDir, feynmanAgentDir);
   const configuredSources = new Set(
     settingsManager
@@ -234,12 +270,27 @@ async function handlePackagesCommand(subcommand: string | undefined, args: strin
   console.log("Optional packages installed.");
 }
-function handleSearchCommand(subcommand: string | undefined): void {
+function handleSearchCommand(subcommand: string | undefined, args: string[]): void {
   if (!subcommand || subcommand === "status") {
     printSearchStatus();
     return;
   }
+  if (subcommand === "set") {
+    const provider = args[0] as PiWebSearchProvider | undefined;
+    const validProviders: PiWebSearchProvider[] = ["auto", "perplexity", "exa", "gemini"];
+    if (!provider || !validProviders.includes(provider)) {
+      throw new Error("Usage: feynman search set <auto|perplexity|exa|gemini> [api-key]");
+    }
+    setSearchProvider(provider, args[1]);
+    return;
+  }
+  if (subcommand === "clear") {
+    clearSearchConfig();
+    return;
+  }
   throw new Error(`Unknown search command: ${subcommand}`);
 }
@@ -297,9 +348,11 @@ export async function main(): Promise<void> {
"alpha-login": { type: "boolean" },
"alpha-logout": { type: "boolean" },
"alpha-status": { type: "boolean" },
mode: { type: "string" },
model: { type: "string" },
"new-session": { type: "boolean" },
prompt: { type: "string" },
"service-tier": { type: "string" },
"session-dir": { type: "string" },
"setup-preview": { type: "boolean" },
thinking: { type: "string" },
@@ -406,7 +459,7 @@ export async function main(): Promise<void> {
}
if (command === "search") {
-handleSearchCommand(rest[0]);
+handleSearchCommand(rest[0], rest.slice(1));
return;
}
@@ -426,8 +479,19 @@ export async function main(): Promise<void> {
}
const explicitModelSpec = values.model ?? process.env.FEYNMAN_MODEL;
const explicitServiceTier = normalizeServiceTier(values["service-tier"] ?? process.env.FEYNMAN_SERVICE_TIER);
const mode = values.mode;
if (mode !== undefined && mode !== "text" && mode !== "json" && mode !== "rpc") {
throw new Error("Unknown mode. Use text, json, or rpc.");
}
if ((values["service-tier"] ?? process.env.FEYNMAN_SERVICE_TIER) && !explicitServiceTier) {
throw new Error("Unknown service tier. Use auto, default, flex, priority, or standard_only.");
}
if (explicitServiceTier) {
process.env.FEYNMAN_SERVICE_TIER = explicitServiceTier;
}
if (explicitModelSpec) {
-const modelRegistry = new ModelRegistry(AuthStorage.create(feynmanAuthPath));
+const modelRegistry = createModelRegistry(feynmanAuthPath);
const explicitModel = parseModelSpec(explicitModelSpec, modelRegistry);
if (!explicitModel) {
throw new Error(`Unknown model: ${explicitModelSpec}`);
@@ -456,6 +520,7 @@ export async function main(): Promise<void> {
sessionDir,
feynmanAgentDir,
feynmanVersion,
mode,
thinkingLevel,
explicitModelSpec,
oneShotPrompt: values.prompt,


@@ -1,6 +1,12 @@
-import { main } from "./cli.js";
+import { ensureSupportedNodeVersion } from "./system/node-version.js";
-main().catch((error) => {
+async function run(): Promise<void> {
ensureSupportedNodeVersion();
const { main } = await import("./cli.js");
await main();
}
run().catch((error) => {
console.error(error instanceof Error ? error.message : String(error));
process.exitCode = 1;
});


@@ -1,4 +1,4 @@
-import { AuthStorage, ModelRegistry } from "@mariozechner/pi-coding-agent";
+import { createModelRegistry } from "./registry.js";
type ModelRecord = {
provider: string;
@@ -95,6 +95,14 @@ const RESEARCH_MODEL_PREFERENCES = [
spec: "zai/glm-5",
reason: "good fallback when GLM is the available research model",
},
{
spec: "minimax/minimax-m2.7",
reason: "good fallback when MiniMax is the available research model",
},
{
spec: "minimax/minimax-m2.7-highspeed",
reason: "good fallback when MiniMax is the available research model",
},
{
spec: "kimi-coding/kimi-k2-thinking",
reason: "good fallback when Kimi is the available research model",
@@ -166,10 +174,6 @@ function sortProviders(left: ProviderStatus, right: ProviderStatus): number {
return left.label.localeCompare(right.label);
}
-function createModelRegistry(authPath: string): ModelRegistry {
-return new ModelRegistry(AuthStorage.create(authPath));
-}
export function getAvailableModelRecords(authPath: string): ModelRecord[] {
return createModelRegistry(authPath)
.getAvailable()
@@ -258,7 +262,9 @@ export function buildModelStatusSnapshotFromRecords(
const guidance: string[] = [];
if (available.length === 0) {
guidance.push("No authenticated Pi models are available yet.");
-guidance.push("Run `feynman model login <provider>` or add provider credentials that Pi can see.");
+guidance.push(
"Run `feynman model login <provider>` (OAuth) or configure an API key (env var, auth.json, or models.json for custom providers).",
);
guidance.push("After auth is in place, rerun `feynman model list` or `feynman setup model`.");
} else if (!current) {
guidance.push(`No default research model is set. Recommended: ${recommended?.spec}.`);


@@ -1,8 +1,11 @@
import { AuthStorage } from "@mariozechner/pi-coding-agent";
import { writeFileSync } from "node:fs";
import { exec as execCallback } from "node:child_process";
import { promisify } from "node:util";
import { readJson } from "../pi/settings.js";
import { promptChoice, promptText } from "../setup/prompts.js";
import { openUrl } from "../system/open-url.js";
import { printInfo, printSection, printSuccess, printWarning } from "../ui/terminal.js";
import {
buildModelStatusSnapshotFromRecords,
@@ -11,6 +14,10 @@ import {
getSupportedModelRecords,
type ModelStatusSnapshot,
} from "./catalog.js";
import { createModelRegistry, getModelsJsonPath } from "./registry.js";
import { upsertProviderBaseUrl, upsertProviderConfig } from "./models-json.js";
const exec = promisify(execCallback);
function collectModelStatus(settingsPath: string, authPath: string): ModelStatusSnapshot {
return buildModelStatusSnapshotFromRecords(
@@ -57,6 +64,453 @@ async function selectOAuthProvider(authPath: string, action: "login" | "logout")
return providers[selection];
}
type ApiKeyProviderInfo = {
id: string;
label: string;
envVar?: string;
};
const API_KEY_PROVIDERS: ApiKeyProviderInfo[] = [
{ id: "__custom__", label: "Custom provider (baseUrl + API key)" },
{ id: "openai", label: "OpenAI Platform API", envVar: "OPENAI_API_KEY" },
{ id: "anthropic", label: "Anthropic API", envVar: "ANTHROPIC_API_KEY" },
{ id: "google", label: "Google Gemini API", envVar: "GEMINI_API_KEY" },
{ id: "openrouter", label: "OpenRouter", envVar: "OPENROUTER_API_KEY" },
{ id: "zai", label: "Z.AI / GLM", envVar: "ZAI_API_KEY" },
{ id: "kimi-coding", label: "Kimi / Moonshot", envVar: "KIMI_API_KEY" },
{ id: "minimax", label: "MiniMax", envVar: "MINIMAX_API_KEY" },
{ id: "minimax-cn", label: "MiniMax (China)", envVar: "MINIMAX_CN_API_KEY" },
{ id: "mistral", label: "Mistral", envVar: "MISTRAL_API_KEY" },
{ id: "groq", label: "Groq", envVar: "GROQ_API_KEY" },
{ id: "xai", label: "xAI", envVar: "XAI_API_KEY" },
{ id: "cerebras", label: "Cerebras", envVar: "CEREBRAS_API_KEY" },
{ id: "vercel-ai-gateway", label: "Vercel AI Gateway", envVar: "AI_GATEWAY_API_KEY" },
{ id: "huggingface", label: "Hugging Face", envVar: "HF_TOKEN" },
{ id: "opencode", label: "OpenCode Zen", envVar: "OPENCODE_API_KEY" },
{ id: "opencode-go", label: "OpenCode Go", envVar: "OPENCODE_API_KEY" },
{ id: "azure-openai-responses", label: "Azure OpenAI (Responses)", envVar: "AZURE_OPENAI_API_KEY" },
];
async function selectApiKeyProvider(): Promise<ApiKeyProviderInfo | undefined> {
const choices = API_KEY_PROVIDERS.map(
(provider) => `${provider.id}: ${provider.label}${provider.envVar ? ` (${provider.envVar})` : ""}`,
);
choices.push("Cancel");
const selection = await promptChoice("Choose an API-key provider:", choices, 0);
if (selection >= API_KEY_PROVIDERS.length) {
return undefined;
}
return API_KEY_PROVIDERS[selection];
}
type CustomProviderSetup = {
providerId: string;
modelIds: string[];
baseUrl: string;
api: "openai-completions" | "openai-responses" | "anthropic-messages" | "google-generative-ai";
apiKeyConfig: string;
/**
* If true, add `Authorization: Bearer <apiKey>` to requests in addition to
* whatever the API mode uses (useful for proxies that implement /v1/messages
* but expect Bearer auth instead of x-api-key).
*/
authHeader: boolean;
};
function normalizeProviderId(value: string): string {
return value.trim().toLowerCase().replace(/\s+/g, "-");
}
function normalizeModelIds(value: string): string[] {
const items = value
.split(",")
.map((entry) => entry.trim())
.filter(Boolean);
return Array.from(new Set(items));
}
function normalizeBaseUrl(value: string): string {
return value.trim().replace(/\/+$/, "");
}
function normalizeCustomProviderBaseUrl(
api: CustomProviderSetup["api"],
baseUrl: string,
): { baseUrl: string; note?: string } {
const normalized = normalizeBaseUrl(baseUrl);
if (!normalized) {
return { baseUrl: normalized };
}
// Pi expects Anthropic baseUrl without `/v1` (it appends `/v1/messages` internally).
if (api === "anthropic-messages" && /\/v1$/i.test(normalized)) {
return { baseUrl: normalized.replace(/\/v1$/i, ""), note: "Stripped trailing /v1 for Anthropic mode." };
}
return { baseUrl: normalized };
}
function isLocalBaseUrl(baseUrl: string): boolean {
return /^(https?:\/\/)?(localhost|127\.0\.0\.1|0\.0\.0\.0)(:|\/|$)/i.test(baseUrl);
}
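The base-URL normalization above has one subtle case worth illustrating: Pi's Anthropic mode appends `/v1/messages` itself, so a user-supplied trailing `/v1` is stripped. A minimal standalone sketch, where `normalizeForApi` is a hypothetical stand-in for `normalizeCustomProviderBaseUrl` that drops the `note` field:

```typescript
// Trailing slashes are always stripped; a trailing /v1 is additionally
// stripped for Anthropic mode because Pi appends /v1/messages internally.
function normalizeBaseUrl(value: string): string {
  return value.trim().replace(/\/+$/, "");
}

function normalizeForApi(api: string, baseUrl: string): string {
  const normalized = normalizeBaseUrl(baseUrl);
  if (api === "anthropic-messages" && /\/v1$/i.test(normalized)) {
    return normalized.replace(/\/v1$/i, "");
  }
  return normalized;
}

console.log(normalizeForApi("anthropic-messages", "https://api.anthropic.com/v1/")); // https://api.anthropic.com
console.log(normalizeForApi("openai-completions", "http://localhost:11434/v1"));     // http://localhost:11434/v1
```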
async function resolveApiKeyConfig(apiKeyConfig: string): Promise<string | undefined> {
const trimmed = apiKeyConfig.trim();
if (!trimmed) return undefined;
if (trimmed.startsWith("!")) {
const command = trimmed.slice(1).trim();
if (!command) return undefined;
const shell = process.platform === "win32" ? process.env.ComSpec || "cmd.exe" : process.env.SHELL || "/bin/sh";
try {
const { stdout } = await exec(command, { shell, maxBuffer: 1024 * 1024 });
const value = stdout.trim();
return value || undefined;
} catch {
return undefined;
}
}
const envValue = process.env[trimmed];
if (typeof envValue === "string" && envValue.trim()) {
return envValue.trim();
}
// Fall back to literal value.
return trimmed;
}
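The resolver above tries, in order: a `!command` shell invocation, an environment variable name, and finally the literal value. A simplified synchronous sketch of that precedence, with the shell step stubbed out as an injected callback (`resolveApiKeyConfigSketch` is illustrative, not part of the codebase):

```typescript
// Resolution order: "!cmd" runs a command and uses its stdout; a bare name is
// looked up as an env var; anything else is treated as the literal key.
function resolveApiKeyConfigSketch(
  config: string,
  env: Record<string, string | undefined>,
  runCommand: (cmd: string) => string, // stub for child_process.exec
): string | undefined {
  const trimmed = config.trim();
  if (!trimmed) return undefined;
  if (trimmed.startsWith("!")) {
    const command = trimmed.slice(1).trim();
    if (!command) return undefined;
    return runCommand(command).trim() || undefined;
  }
  const envValue = env[trimmed];
  if (typeof envValue === "string" && envValue.trim()) return envValue.trim();
  return trimmed; // fall back to the literal value
}

const env = { MY_KEY: "sk-from-env" };
const run = (_cmd: string) => "sk-from-command\n";
console.log(resolveApiKeyConfigSketch("!pass show llm", env, run)); // sk-from-command
console.log(resolveApiKeyConfigSketch("MY_KEY", env, run));         // sk-from-env
console.log(resolveApiKeyConfigSketch("sk-literal", env, run));     // sk-literal
```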
async function bestEffortFetchOpenAiModelIds(
baseUrl: string,
apiKey: string,
authHeader: boolean,
): Promise<string[] | undefined> {
const url = `${baseUrl}/models`;
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), 5000);
try {
const response = await fetch(url, {
method: "GET",
headers: authHeader ? { Authorization: `Bearer ${apiKey}` } : undefined,
signal: controller.signal,
});
if (!response.ok) {
return undefined;
}
const json = (await response.json()) as any;
if (!Array.isArray(json?.data)) return undefined;
return json.data
.map((entry: any) => (typeof entry?.id === "string" ? entry.id : undefined))
.filter(Boolean);
} catch {
return undefined;
} finally {
clearTimeout(timer);
}
}
async function promptCustomProviderSetup(): Promise<CustomProviderSetup | undefined> {
printSection("Custom Provider");
const providerIdInput = await promptText("Provider id (e.g. my-proxy)", "custom");
const providerId = normalizeProviderId(providerIdInput);
if (!providerId || providerId === "__custom__") {
printWarning("Invalid provider id.");
return undefined;
}
const apiChoices = [
"openai-completions — OpenAI Chat Completions compatible (e.g. /v1/chat/completions)",
"openai-responses — OpenAI Responses compatible (e.g. /v1/responses)",
"anthropic-messages — Anthropic Messages compatible (e.g. /v1/messages)",
"google-generative-ai — Google Generative AI compatible (generativelanguage.googleapis.com)",
"Cancel",
];
const apiSelection = await promptChoice("API mode:", apiChoices, 0);
if (apiSelection >= 4) {
return undefined;
}
const api = ["openai-completions", "openai-responses", "anthropic-messages", "google-generative-ai"][apiSelection] as CustomProviderSetup["api"];
const baseUrlDefault = ((): string => {
if (api === "openai-completions" || api === "openai-responses") return "http://localhost:11434/v1";
if (api === "anthropic-messages") return "https://api.anthropic.com";
if (api === "google-generative-ai") return "https://generativelanguage.googleapis.com";
return "http://localhost:11434/v1";
})();
const baseUrlPrompt =
api === "openai-completions" || api === "openai-responses"
? "Base URL (include /v1 for OpenAI-compatible endpoints)"
: api === "anthropic-messages"
? "Base URL (no trailing /, no /v1)"
: "Base URL (no trailing /)";
const baseUrlRaw = await promptText(baseUrlPrompt, baseUrlDefault);
const { baseUrl, note: baseUrlNote } = normalizeCustomProviderBaseUrl(api, baseUrlRaw);
if (!baseUrl) {
printWarning("Base URL is required.");
return undefined;
}
if (baseUrlNote) {
printInfo(baseUrlNote);
}
let authHeader = false;
if (api === "openai-completions" || api === "openai-responses") {
const defaultAuthHeader = !isLocalBaseUrl(baseUrl);
const authHeaderChoices = [
"Yes (send Authorization: Bearer <apiKey>)",
"No (common for local Ollama/vLLM/LM Studio)",
"Cancel",
];
const authHeaderSelection = await promptChoice(
"Send Authorization header?",
authHeaderChoices,
defaultAuthHeader ? 0 : 1,
);
if (authHeaderSelection >= 2) {
return undefined;
}
authHeader = authHeaderSelection === 0;
}
if (api === "anthropic-messages") {
const defaultAuthHeader = isLocalBaseUrl(baseUrl);
const authHeaderChoices = [
"Yes (also send Authorization: Bearer <apiKey>)",
"No (standard Anthropic uses x-api-key only)",
"Cancel",
];
const authHeaderSelection = await promptChoice(
"Also send Authorization header?",
authHeaderChoices,
defaultAuthHeader ? 0 : 1,
);
if (authHeaderSelection >= 2) {
return undefined;
}
authHeader = authHeaderSelection === 0;
}
printInfo("API key value supports:");
printInfo(" - literal secret (stored in models.json)");
printInfo(" - env var name (resolved at runtime)");
printInfo(" - !command (executes and uses stdout)");
const apiKeyConfigRaw = (await promptText("API key / resolver", "")).trim();
const apiKeyConfig = apiKeyConfigRaw || "local";
if (!apiKeyConfigRaw) {
printInfo("Using placeholder apiKey value (required by Pi for custom providers).");
}
let modelIdsDefault = "my-model";
if (api === "openai-completions" || api === "openai-responses") {
// Best-effort: hit /models so users can pick correct ids (especially for proxies).
const resolvedKey = await resolveApiKeyConfig(apiKeyConfig);
const modelIds = resolvedKey ? await bestEffortFetchOpenAiModelIds(baseUrl, resolvedKey, authHeader) : undefined;
if (modelIds && modelIds.length > 0) {
const sample = modelIds.slice(0, 10).join(", ");
printInfo(`Detected models: ${sample}${modelIds.length > 10 ? ", ..." : ""}`);
modelIdsDefault = modelIds.includes("sonnet") ? "sonnet" : modelIds[0]!;
}
}
const modelIdsRaw = await promptText("Model id(s) (comma-separated)", modelIdsDefault);
const modelIds = normalizeModelIds(modelIdsRaw);
if (modelIds.length === 0) {
printWarning("At least one model id is required.");
return undefined;
}
return { providerId, modelIds, baseUrl, api, apiKeyConfig, authHeader };
}
async function verifyCustomProvider(setup: CustomProviderSetup, authPath: string): Promise<void> {
const registry = createModelRegistry(authPath);
const modelsError = registry.getError();
if (modelsError) {
printWarning("Verification: models.json failed to load.");
for (const line of modelsError.split("\n")) {
printInfo(` ${line}`);
}
return;
}
const all = registry.getAll();
const hasModel = setup.modelIds.some((id) => all.some((model) => model.provider === setup.providerId && model.id === id));
if (!hasModel) {
printWarning("Verification: model registry does not contain the configured provider/model ids.");
return;
}
const available = registry.getAvailable();
const hasAvailable = setup.modelIds.some((id) =>
available.some((model) => model.provider === setup.providerId && model.id === id),
);
if (!hasAvailable) {
printWarning("Verification: provider is not considered authenticated/available.");
return;
}
const apiKey = await registry.getApiKeyForProvider(setup.providerId);
if (!apiKey) {
printWarning("Verification: API key could not be resolved (check env var name / !command).");
return;
}
const timeoutMs = 8000;
// Best-effort network check for OpenAI-compatible endpoints
if (setup.api === "openai-completions" || setup.api === "openai-responses") {
const url = `${setup.baseUrl}/models`;
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), timeoutMs);
try {
const response = await fetch(url, {
method: "GET",
headers: setup.authHeader ? { Authorization: `Bearer ${apiKey}` } : undefined,
signal: controller.signal,
});
if (!response.ok) {
printWarning(`Verification: ${url} returned ${response.status} ${response.statusText}`);
return;
}
const json = (await response.json()) as unknown;
const modelIds = Array.isArray((json as any)?.data)
? (json as any).data.map((entry: any) => (typeof entry?.id === "string" ? entry.id : undefined)).filter(Boolean)
: [];
const missing = setup.modelIds.filter((id) => modelIds.length > 0 && !modelIds.includes(id));
if (modelIds.length > 0 && missing.length > 0) {
printWarning(`Verification: /models does not list configured model id(s): ${missing.join(", ")}`);
return;
}
printSuccess("Verification: endpoint reachable and authorized.");
} catch (error) {
printWarning(`Verification: failed to reach ${url}: ${error instanceof Error ? error.message : String(error)}`);
} finally {
clearTimeout(timer);
}
return;
}
if (setup.api === "anthropic-messages") {
const url = `${setup.baseUrl}/v1/models?limit=1`;
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), timeoutMs);
try {
const headers: Record<string, string> = {
"x-api-key": apiKey,
"anthropic-version": "2023-06-01",
};
if (setup.authHeader) {
headers.Authorization = `Bearer ${apiKey}`;
}
const response = await fetch(url, {
method: "GET",
headers,
signal: controller.signal,
});
if (!response.ok) {
printWarning(`Verification: ${url} returned ${response.status} ${response.statusText}`);
if (response.status === 404) {
printInfo(" Tip: For Anthropic mode, use a base URL without /v1 (e.g. https://api.anthropic.com).");
}
if ((response.status === 401 || response.status === 403) && !setup.authHeader) {
printInfo(" Tip: Some proxies require `Authorization: Bearer <apiKey>` even in Anthropic mode.");
}
return;
}
printSuccess("Verification: endpoint reachable and authorized.");
} catch (error) {
printWarning(`Verification: failed to reach ${url}: ${error instanceof Error ? error.message : String(error)}`);
} finally {
clearTimeout(timer);
}
return;
}
if (setup.api === "google-generative-ai") {
const url = `${setup.baseUrl}/v1beta/models?key=${encodeURIComponent(apiKey)}`;
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), timeoutMs);
try {
const response = await fetch(url, { method: "GET", signal: controller.signal });
if (!response.ok) {
printWarning(`Verification: ${url} returned ${response.status} ${response.statusText}`);
return;
}
printSuccess("Verification: endpoint reachable and authorized.");
} catch (error) {
printWarning(`Verification: failed to reach ${url}: ${error instanceof Error ? error.message : String(error)}`);
} finally {
clearTimeout(timer);
}
return;
}
printInfo("Verification: skipped network probe for this API mode.");
}
async function configureApiKeyProvider(authPath: string): Promise<boolean> {
const provider = await selectApiKeyProvider();
if (!provider) {
printInfo("API key setup cancelled.");
return false;
}
if (provider.id === "__custom__") {
const setup = await promptCustomProviderSetup();
if (!setup) {
printInfo("Custom provider setup cancelled.");
return false;
}
const modelsJsonPath = getModelsJsonPath(authPath);
const result = upsertProviderConfig(modelsJsonPath, setup.providerId, {
baseUrl: setup.baseUrl,
apiKey: setup.apiKeyConfig,
api: setup.api,
authHeader: setup.authHeader,
models: setup.modelIds.map((id) => ({ id })),
});
if (!result.ok) {
printWarning(result.error);
return false;
}
printSuccess(`Saved custom provider: ${setup.providerId}`);
await verifyCustomProvider(setup, authPath);
return true;
}
printSection(`API Key: ${provider.label}`);
if (provider.envVar) {
printInfo(`Tip: to avoid writing secrets to disk, set ${provider.envVar} in your shell or .env.`);
}
const apiKey = await promptText("Paste API key (leave empty to use env var instead)", "");
if (!apiKey) {
if (provider.envVar) {
printInfo(`Set ${provider.envVar} and rerun setup (or run \`feynman model list\`).`);
} else {
printInfo("No API key provided.");
}
return false;
}
AuthStorage.create(authPath).set(provider.id, { type: "api_key", key: apiKey });
printSuccess(`Saved API key for ${provider.id} in auth storage.`);
const baseUrl = await promptText("Base URL override (optional, include /v1 for OpenAI-compatible endpoints)", "");
if (baseUrl) {
const modelsJsonPath = getModelsJsonPath(authPath);
const result = upsertProviderBaseUrl(modelsJsonPath, provider.id, baseUrl);
if (result.ok) {
printSuccess(`Saved baseUrl override for ${provider.id} in models.json.`);
} else {
printWarning(result.error);
}
}
return true;
}
function resolveAvailableModelSpec(authPath: string, input: string): string | undefined {
const normalizedInput = input.trim().toLowerCase();
if (!normalizedInput) {
@@ -110,14 +564,46 @@ export function printModelList(settingsPath: string, authPath: string): void {
}
}
-export async function loginModelProvider(authPath: string, providerId?: string, settingsPath?: string): Promise<void> {
+export async function authenticateModelProvider(authPath: string, settingsPath?: string): Promise<boolean> {
const choices = [
"API key (OpenAI, Anthropic, Google, custom provider, ...)",
"OAuth login (ChatGPT Plus/Pro, Claude Pro/Max, Copilot, ...)",
"Cancel",
];
const selection = await promptChoice("How do you want to authenticate?", choices, 0);
if (selection === 0) {
const configured = await configureApiKeyProvider(authPath);
if (configured && settingsPath) {
const currentSpec = getCurrentModelSpec(settingsPath);
const available = getAvailableModelRecords(authPath);
const currentValid = currentSpec ? available.some((m) => `${m.provider}/${m.id}` === currentSpec) : false;
if ((!currentSpec || !currentValid) && available.length > 0) {
const recommended = chooseRecommendedModel(authPath);
if (recommended) {
setDefaultModelSpec(settingsPath, authPath, recommended.spec);
}
}
}
return configured;
}
if (selection === 1) {
return loginModelProvider(authPath, undefined, settingsPath);
}
printInfo("Authentication cancelled.");
return false;
}
export async function loginModelProvider(authPath: string, providerId?: string, settingsPath?: string): Promise<boolean> {
const provider = providerId ? resolveOAuthProvider(authPath, providerId) : await selectOAuthProvider(authPath, "login");
if (!provider) {
if (providerId) {
throw new Error(`Unknown OAuth model provider: ${providerId}`);
}
printInfo("Login cancelled.");
-return;
+return false;
}
const authStorage = AuthStorage.create(authPath);
@@ -126,7 +612,13 @@ export async function loginModelProvider(authPath: string, providerId?: string,
await authStorage.login(provider.id, {
onAuth: (info: { url: string; instructions?: string }) => {
printSection(`Login: ${provider.name ?? provider.id}`);
-printInfo(`Open this URL: ${info.url}`);
+const opened = openUrl(info.url);
if (opened) {
printInfo("Opened the login URL in your browser.");
} else {
printWarning("Couldn't open your browser automatically.");
}
printInfo(`Auth URL: ${info.url}`);
if (info.instructions) {
printInfo(info.instructions);
}
@@ -159,6 +651,8 @@ export async function loginModelProvider(authPath: string, providerId?: string,
}
}
}
return true;
}
export async function logoutModelProvider(authPath: string, providerId?: string): Promise<void> {
@@ -193,11 +687,34 @@ export function setDefaultModelSpec(settingsPath: string, authPath: string, spec
export async function runModelSetup(settingsPath: string, authPath: string): Promise<void> {
let status = collectModelStatus(settingsPath, authPath);
-if (status.availableModels.length === 0) {
+while (status.availableModels.length === 0) {
-await loginModelProvider(authPath, undefined, settingsPath);
+const choices = [
"API key (OpenAI, Anthropic, ZAI, Kimi, MiniMax, ...)",
"OAuth login (ChatGPT Plus/Pro, Claude Pro/Max, Copilot, ...)",
"Cancel",
];
const selection = await promptChoice("Choose how to configure model access:", choices, 0);
if (selection === 0) {
const configured = await configureApiKeyProvider(authPath);
if (!configured) {
status = collectModelStatus(settingsPath, authPath);
continue;
}
} else if (selection === 1) {
const loggedIn = await loginModelProvider(authPath, undefined, settingsPath);
if (!loggedIn) {
status = collectModelStatus(settingsPath, authPath);
continue;
}
} else {
printInfo("Setup cancelled.");
return;
}
status = collectModelStatus(settingsPath, authPath);
if (status.availableModels.length === 0) {
-return;
+printWarning("No authenticated models are available yet.");
printInfo("If you configured a custom provider, ensure it has `apiKey` set in models.json.");
printInfo("Tip: run `feynman doctor` to see models.json path + load errors.");
}
}

src/model/models-json.ts (new file, 91 lines)

@@ -0,0 +1,91 @@
import { chmodSync, existsSync, mkdirSync, readFileSync, writeFileSync } from "node:fs";
import { dirname } from "node:path";
type ModelsJson = {
providers?: Record<string, Record<string, unknown>>;
};
function readModelsJson(modelsJsonPath: string): { ok: true; value: ModelsJson } | { ok: false; error: string } {
if (!existsSync(modelsJsonPath)) {
return { ok: true, value: { providers: {} } };
}
try {
const raw = readFileSync(modelsJsonPath, "utf8").trim();
if (!raw) {
return { ok: true, value: { providers: {} } };
}
const parsed = JSON.parse(raw) as unknown;
if (!parsed || typeof parsed !== "object") {
return { ok: false, error: `Invalid models.json (expected an object): ${modelsJsonPath}` };
}
return { ok: true, value: parsed as ModelsJson };
} catch (error) {
return {
ok: false,
error: `Failed to read models.json: ${error instanceof Error ? error.message : String(error)}`,
};
}
}
export function upsertProviderBaseUrl(
modelsJsonPath: string,
providerId: string,
baseUrl: string,
): { ok: true } | { ok: false; error: string } {
return upsertProviderConfig(modelsJsonPath, providerId, { baseUrl });
}
export type ProviderConfigPatch = {
baseUrl?: string;
apiKey?: string;
api?: string;
authHeader?: boolean;
headers?: Record<string, string>;
models?: Array<{ id: string }>;
};
export function upsertProviderConfig(
modelsJsonPath: string,
providerId: string,
patch: ProviderConfigPatch,
): { ok: true } | { ok: false; error: string } {
const loaded = readModelsJson(modelsJsonPath);
if (!loaded.ok) {
return loaded;
}
const value: ModelsJson = loaded.value;
const providers: Record<string, Record<string, unknown>> = {
...(value.providers && typeof value.providers === "object" ? value.providers : {}),
};
const currentProvider =
providers[providerId] && typeof providers[providerId] === "object" ? providers[providerId] : {};
const nextProvider: Record<string, unknown> = { ...currentProvider };
if (patch.baseUrl !== undefined) nextProvider.baseUrl = patch.baseUrl;
if (patch.apiKey !== undefined) nextProvider.apiKey = patch.apiKey;
if (patch.api !== undefined) nextProvider.api = patch.api;
if (patch.authHeader !== undefined) nextProvider.authHeader = patch.authHeader;
if (patch.headers !== undefined) nextProvider.headers = patch.headers;
if (patch.models !== undefined) nextProvider.models = patch.models;
providers[providerId] = nextProvider;
const next: ModelsJson = { ...value, providers };
try {
mkdirSync(dirname(modelsJsonPath), { recursive: true });
writeFileSync(modelsJsonPath, JSON.stringify(next, null, 2) + "\n", "utf8");
// models.json can contain API keys/headers; default to user-only permissions.
try {
chmodSync(modelsJsonPath, 0o600);
} catch {
// ignore permission errors (best-effort)
}
return { ok: true };
} catch (error) {
return { ok: false, error: `Failed to write models.json: ${error instanceof Error ? error.message : String(error)}` };
}
}
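The patch semantics above only overwrite fields that are actually present on the patch object; everything else in an existing provider entry is preserved. A minimal in-memory sketch of that merge, using a reduced `Patch` type for illustration:

```typescript
// Reduced sketch of upsertProviderConfig's merge: undefined patch fields
// leave the existing provider entry untouched.
type Patch = { baseUrl?: string; apiKey?: string; api?: string };

function mergeProvider(
  current: Record<string, unknown>,
  patch: Patch,
): Record<string, unknown> {
  const next = { ...current };
  if (patch.baseUrl !== undefined) next.baseUrl = patch.baseUrl;
  if (patch.apiKey !== undefined) next.apiKey = patch.apiKey;
  if (patch.api !== undefined) next.api = patch.api;
  return next;
}

const existing = { baseUrl: "http://localhost:11434/v1", apiKey: "old" };
const merged = mergeProvider(existing, { apiKey: "new" });
console.log(merged.baseUrl); // http://localhost:11434/v1 (untouched)
console.log(merged.apiKey);  // new
```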

src/model/registry.ts (new file, 11 lines)

@@ -0,0 +1,11 @@
import { dirname, resolve } from "node:path";
import { AuthStorage, ModelRegistry } from "@mariozechner/pi-coding-agent";
export function getModelsJsonPath(authPath: string): string {
return resolve(dirname(authPath), "models.json");
}
export function createModelRegistry(authPath: string): ModelRegistry {
return ModelRegistry.create(AuthStorage.create(authPath), getModelsJsonPath(authPath));
}

src/model/service-tier.ts (new file, 65 lines)

@@ -0,0 +1,65 @@
import { mkdirSync, readFileSync, writeFileSync } from "node:fs";
import { dirname } from "node:path";

export const FEYNMAN_SERVICE_TIERS = [
  "auto",
  "default",
  "flex",
  "priority",
  "standard_only",
] as const;

export type FeynmanServiceTier = (typeof FEYNMAN_SERVICE_TIERS)[number];

const SERVICE_TIER_SET = new Set<string>(FEYNMAN_SERVICE_TIERS);
const OPENAI_SERVICE_TIERS = new Set<FeynmanServiceTier>(["auto", "default", "flex", "priority"]);
const ANTHROPIC_SERVICE_TIERS = new Set<FeynmanServiceTier>(["auto", "standard_only"]);

function readSettings(settingsPath: string): Record<string, unknown> {
  try {
    return JSON.parse(readFileSync(settingsPath, "utf8")) as Record<string, unknown>;
  } catch {
    return {};
  }
}

export function normalizeServiceTier(value: string | undefined): FeynmanServiceTier | undefined {
  if (!value) return undefined;
  const normalized = value.trim().toLowerCase();
  return SERVICE_TIER_SET.has(normalized) ? (normalized as FeynmanServiceTier) : undefined;
}

export function getConfiguredServiceTier(settingsPath: string): FeynmanServiceTier | undefined {
  const settings = readSettings(settingsPath);
  return normalizeServiceTier(typeof settings.serviceTier === "string" ? settings.serviceTier : undefined);
}

export function setConfiguredServiceTier(settingsPath: string, tier: FeynmanServiceTier | undefined): void {
  const settings = readSettings(settingsPath);
  if (tier) {
    settings.serviceTier = tier;
  } else {
    delete settings.serviceTier;
  }
  mkdirSync(dirname(settingsPath), { recursive: true });
  writeFileSync(settingsPath, JSON.stringify(settings, null, 2) + "\n", "utf8");
}

export function resolveActiveServiceTier(settingsPath: string): FeynmanServiceTier | undefined {
  return normalizeServiceTier(process.env.FEYNMAN_SERVICE_TIER) ?? getConfiguredServiceTier(settingsPath);
}

export function resolveProviderServiceTier(
  provider: string | undefined,
  tier: FeynmanServiceTier | undefined,
): FeynmanServiceTier | undefined {
  if (!provider || !tier) return undefined;
  if ((provider === "openai" || provider === "openai-codex") && OPENAI_SERVICE_TIERS.has(tier)) {
    return tier;
  }
  if (provider === "anthropic" && ANTHROPIC_SERVICE_TIERS.has(tier)) {
    return tier;
  }
  return undefined;
}
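A minimal standalone sketch of how the tier normalization and provider gating above behave (re-implemented here for illustration; the real logic lives in src/model/service-tier.ts):

```typescript
// Mirror of the normalization + per-provider gating from service-tier.ts:
// input is lowercased and trimmed, unknown values are rejected, and a tier
// only survives if the target provider actually supports it.
const TIERS = ["auto", "default", "flex", "priority", "standard_only"] as const;
type Tier = (typeof TIERS)[number];
const OPENAI_TIERS = new Set<Tier>(["auto", "default", "flex", "priority"]);
const ANTHROPIC_TIERS = new Set<Tier>(["auto", "standard_only"]);

function normalizeTier(value: string | undefined): Tier | undefined {
  if (!value) return undefined;
  const normalized = value.trim().toLowerCase();
  return (TIERS as readonly string[]).includes(normalized) ? (normalized as Tier) : undefined;
}

function resolveProviderTier(provider: string | undefined, tier: Tier | undefined): Tier | undefined {
  if (!provider || !tier) return undefined;
  if ((provider === "openai" || provider === "openai-codex") && OPENAI_TIERS.has(tier)) return tier;
  if (provider === "anthropic" && ANTHROPIC_TIERS.has(tier)) return tier;
  return undefined; // unknown providers never receive a tier
}

console.log(normalizeTier(" FLEX "));                 // "flex"
console.log(resolveProviderTier("openai", "flex"));   // "flex"
console.log(resolveProviderTier("anthropic", "flex")); // undefined: not an Anthropic tier
```

Note that `standard_only` is the inverse case: it passes the Anthropic gate but not the OpenAI one.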


@@ -1,22 +1,38 @@
 import { spawn } from "node:child_process";
 import { existsSync } from "node:fs";
-import { buildPiArgs, buildPiEnv, type PiRuntimeOptions, resolvePiPaths } from "./runtime.js";
+import { constants } from "node:os";
+import { buildPiArgs, buildPiEnv, type PiRuntimeOptions, resolvePiPaths, toNodeImportSpecifier } from "./runtime.js";
+import { ensureSupportedNodeVersion } from "../system/node-version.js";
+
+export function exitCodeFromSignal(signal: NodeJS.Signals): number {
+  const signalNumber = constants.signals[signal];
+  return typeof signalNumber === "number" ? 128 + signalNumber : 1;
+}
+
 export async function launchPiChat(options: PiRuntimeOptions): Promise<void> {
-  const { piCliPath, promisePolyfillPath } = resolvePiPaths(options.appRoot);
+  ensureSupportedNodeVersion();
+  const { piCliPath, promisePolyfillPath, promisePolyfillSourcePath, tsxLoaderPath } = resolvePiPaths(options.appRoot);
   if (!existsSync(piCliPath)) {
     throw new Error(`Pi CLI not found: ${piCliPath}`);
   }
-  if (!existsSync(promisePolyfillPath)) {
+  const useBuiltPolyfill = existsSync(promisePolyfillPath);
+  const useDevPolyfill = !useBuiltPolyfill && existsSync(promisePolyfillSourcePath) && existsSync(tsxLoaderPath);
+  if (!useBuiltPolyfill && !useDevPolyfill) {
     throw new Error(`Promise polyfill not found: ${promisePolyfillPath}`);
   }
-  if (process.stdout.isTTY) {
+  if (process.stdout.isTTY && options.mode !== "rpc") {
     process.stdout.write("\x1b[2J\x1b[3J\x1b[H");
   }
-  const child = spawn(process.execPath, ["--import", promisePolyfillPath, piCliPath, ...buildPiArgs(options)], {
+  const importArgs = useDevPolyfill
+    ? ["--import", toNodeImportSpecifier(tsxLoaderPath), "--import", toNodeImportSpecifier(promisePolyfillSourcePath)]
+    : ["--import", toNodeImportSpecifier(promisePolyfillPath)];
+  const child = spawn(process.execPath, [...importArgs, piCliPath, ...buildPiArgs(options)], {
     cwd: options.workingDir,
     stdio: "inherit",
     env: buildPiEnv(options),
@@ -26,7 +42,9 @@ export async function launchPiChat(options: PiRuntimeOptions): Promise<void> {
   child.on("error", reject);
   child.on("exit", (code, signal) => {
     if (signal) {
-      process.kill(process.pid, signal);
+      console.error(`feynman terminated because the Pi child exited with ${signal}.`);
+      process.exitCode = exitCodeFromSignal(signal);
+      resolvePromise();
       return;
     }
     process.exitCode = code ?? 0;
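The signal-to-exit-code mapping introduced above follows the shell convention that a child killed by signal N is reported as exit code 128 + N; feynman now forwards that code instead of re-raising the signal on itself. A standalone sketch of the helper:

```typescript
import { constants } from "node:os";

// Same shape as the exitCodeFromSignal helper in the diff: look the signal
// number up in os.constants.signals and apply the 128 + N shell convention,
// falling back to a generic failure code for unknown signals.
function exitCodeFromSignal(signal: NodeJS.Signals): number {
  const signalNumber = constants.signals[signal];
  return typeof signalNumber === "number" ? 128 + signalNumber : 1;
}

console.log(exitCodeFromSignal("SIGTERM")); // 143 (128 + 15)
console.log(exitCodeFromSignal("SIGINT"));  // 130 (128 + 2)
```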


@@ -1,6 +1,7 @@
 import type { PackageSource } from "@mariozechner/pi-coding-agent";

 export const CORE_PACKAGE_SOURCES = [
+  "npm:@companion-ai/alpha-hub",
   "npm:pi-subagents",
   "npm:pi-btw",
   "npm:pi-docparser",


@@ -1,5 +1,6 @@
 import { existsSync, readFileSync } from "node:fs";
-import { dirname, resolve } from "node:path";
+import { delimiter, dirname, isAbsolute, resolve } from "node:path";
+import { pathToFileURL } from "node:url";
 import {
   BROWSER_FALLBACK_PATHS,
@@ -14,17 +15,32 @@ export type PiRuntimeOptions = {
   sessionDir: string;
   feynmanAgentDir: string;
   feynmanVersion?: string;
+  mode?: "text" | "json" | "rpc";
   thinkingLevel?: string;
   explicitModelSpec?: string;
   oneShotPrompt?: string;
   initialPrompt?: string;
 };
+
+export function getFeynmanNpmPrefixPath(feynmanAgentDir: string): string {
+  return resolve(dirname(feynmanAgentDir), "npm-global");
+}
+
+export function applyFeynmanPackageManagerEnv(feynmanAgentDir: string): string {
+  const feynmanNpmPrefixPath = getFeynmanNpmPrefixPath(feynmanAgentDir);
+  process.env.FEYNMAN_NPM_PREFIX = feynmanNpmPrefixPath;
+  process.env.NPM_CONFIG_PREFIX = feynmanNpmPrefixPath;
+  process.env.npm_config_prefix = feynmanNpmPrefixPath;
+  return feynmanNpmPrefixPath;
+}
+
 export function resolvePiPaths(appRoot: string) {
   return {
     piPackageRoot: resolve(appRoot, "node_modules", "@mariozechner", "pi-coding-agent"),
     piCliPath: resolve(appRoot, "node_modules", "@mariozechner", "pi-coding-agent", "dist", "cli.js"),
     promisePolyfillPath: resolve(appRoot, "dist", "system", "promise-polyfill.js"),
+    promisePolyfillSourcePath: resolve(appRoot, "src", "system", "promise-polyfill.ts"),
+    tsxLoaderPath: resolve(appRoot, "node_modules", "tsx", "dist", "loader.mjs"),
     researchToolsPath: resolve(appRoot, "extensions", "research-tools.ts"),
     promptTemplatePath: resolve(appRoot, "prompts"),
     systemPromptPath: resolve(appRoot, ".feynman", "SYSTEM.md"),
@@ -33,12 +49,20 @@ export function resolvePiPaths(appRoot: string) {
   };
 }
+
+export function toNodeImportSpecifier(modulePath: string): string {
+  return isAbsolute(modulePath) ? pathToFileURL(modulePath).href : modulePath;
+}
+
 export function validatePiInstallation(appRoot: string): string[] {
   const paths = resolvePiPaths(appRoot);
   const missing: string[] = [];
   if (!existsSync(paths.piCliPath)) missing.push(paths.piCliPath);
-  if (!existsSync(paths.promisePolyfillPath)) missing.push(paths.promisePolyfillPath);
+  if (!existsSync(paths.promisePolyfillPath)) {
+    // Dev fallback: allow running from source without `dist/` build artifacts.
+    const hasDevPolyfill = existsSync(paths.promisePolyfillSourcePath) && existsSync(paths.tsxLoaderPath);
+    if (!hasDevPolyfill) missing.push(paths.promisePolyfillPath);
+  }
   if (!existsSync(paths.researchToolsPath)) missing.push(paths.researchToolsPath);
   if (!existsSync(paths.promptTemplatePath)) missing.push(paths.promptTemplatePath);
@@ -60,6 +84,9 @@ export function buildPiArgs(options: PiRuntimeOptions): string[] {
     args.push("--system-prompt", readFileSync(paths.systemPromptPath, "utf8"));
   }
+  if (options.mode) {
+    args.push("--mode", options.mode);
+  }
   if (options.explicitModelSpec) {
     args.push("--model", options.explicitModelSpec);
   }
@@ -77,23 +104,36 @@ export function buildPiArgs(options: PiRuntimeOptions): string[] {
 export function buildPiEnv(options: PiRuntimeOptions): NodeJS.ProcessEnv {
   const paths = resolvePiPaths(options.appRoot);
+  const feynmanNpmPrefixPath = getFeynmanNpmPrefixPath(options.feynmanAgentDir);
+  const feynmanNpmBinPath = resolve(feynmanNpmPrefixPath, "bin");
+  const feynmanWebSearchConfigPath = resolve(dirname(options.feynmanAgentDir), "web-search.json");
   const currentPath = process.env.PATH ?? "";
-  const binPath = paths.nodeModulesBinPath;
+  const binEntries = [paths.nodeModulesBinPath, resolve(paths.piWorkspaceNodeModulesPath, ".bin"), feynmanNpmBinPath];
+  const binPath = binEntries.join(delimiter);
   return {
     ...process.env,
-    PATH: `${binPath}:${currentPath}`,
+    PATH: `${binPath}${delimiter}${currentPath}`,
     FEYNMAN_VERSION: options.feynmanVersion,
     FEYNMAN_SESSION_DIR: options.sessionDir,
     FEYNMAN_MEMORY_DIR: resolve(dirname(options.feynmanAgentDir), "memory"),
+    FEYNMAN_WEB_SEARCH_CONFIG: feynmanWebSearchConfigPath,
     FEYNMAN_NODE_EXECUTABLE: process.execPath,
     FEYNMAN_BIN_PATH: resolve(options.appRoot, "bin", "feynman.js"),
+    FEYNMAN_NPM_PREFIX: feynmanNpmPrefixPath,
+    // Ensure the Pi child process uses Feynman's agent dir for auth/models/settings.
+    PI_CODING_AGENT_DIR: options.feynmanAgentDir,
     PANDOC_PATH: process.env.PANDOC_PATH ?? resolveExecutable("pandoc", PANDOC_FALLBACK_PATHS),
     PI_HARDWARE_CURSOR: process.env.PI_HARDWARE_CURSOR ?? "1",
     PI_SKIP_VERSION_CHECK: process.env.PI_SKIP_VERSION_CHECK ?? "1",
     MERMAID_CLI_PATH: process.env.MERMAID_CLI_PATH ?? resolveExecutable("mmdc", MERMAID_FALLBACK_PATHS),
     PUPPETEER_EXECUTABLE_PATH:
       process.env.PUPPETEER_EXECUTABLE_PATH ?? resolveExecutable("google-chrome", BROWSER_FALLBACK_PATHS),
+    // Always pin npm's global prefix to the Feynman workspace. npm injects
+    // lowercase config vars into child processes, which would otherwise leak
+    // the caller's global prefix into Pi.
+    NPM_CONFIG_PREFIX: feynmanNpmPrefixPath,
+    npm_config_prefix: feynmanNpmPrefixPath,
   };
 }
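Two of the changes above are portability fixes: PATH entries are now joined with `path.delimiter` (":" on POSIX, ";" on Windows) instead of a hardcoded colon, and absolute module paths handed to `node --import` are converted to `file://` URLs so Windows drive letters are not parsed as URL schemes. A standalone sketch of both (illustrative helpers, not the module's actual exports):

```typescript
import { delimiter, isAbsolute } from "node:path";
import { pathToFileURL } from "node:url";

// Join bin directories ahead of the inherited PATH using the platform's
// PATH separator rather than a hardcoded ":".
function buildPathEnv(binEntries: string[], currentPath: string): string {
  return [...binEntries, currentPath].join(delimiter);
}

// Absolute paths become file:// URLs for `node --import`; bare specifiers
// (package names) pass through unchanged.
function toNodeImportSpecifier(modulePath: string): string {
  return isAbsolute(modulePath) ? pathToFileURL(modulePath).href : modulePath;
}

const merged = buildPathEnv(["/app/node_modules/.bin", "/app/npm-global/bin"], "/usr/bin");
console.log(merged.split(delimiter).length); // 3
console.log(toNodeImportSpecifier("tsx"));   // "tsx"
```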


@@ -1,9 +1,10 @@
 import { existsSync, mkdirSync, readFileSync, writeFileSync } from "node:fs";
 import { dirname } from "node:path";
-import { AuthStorage, ModelRegistry, type PackageSource } from "@mariozechner/pi-coding-agent";
+import { ModelRegistry, type PackageSource } from "@mariozechner/pi-coding-agent";
 import { CORE_PACKAGE_SOURCES, shouldPruneLegacyDefaultPackages } from "./package-presets.js";
+import { createModelRegistry } from "../model/registry.js";

 export type ThinkingLevel = "off" | "minimal" | "low" | "medium" | "high" | "xhigh";
@@ -115,8 +116,7 @@ export function normalizeFeynmanSettings(
     settings.packages = [...CORE_PACKAGE_SOURCES];
   }
-  const authStorage = AuthStorage.create(authPath);
-  const modelRegistry = new ModelRegistry(authStorage);
+  const modelRegistry = createModelRegistry(authPath);
   const availableModels = modelRegistry.getAvailable().map((model) => ({
     provider: model.provider,
     id: model.id,


@@ -1,13 +1,15 @@
-import { existsSync, readFileSync } from "node:fs";
-import { homedir } from "node:os";
-import { resolve } from "node:path";
+import { existsSync, mkdirSync, readFileSync, writeFileSync } from "node:fs";
+import { dirname, resolve } from "node:path";
+import { getFeynmanHome } from "../config/paths.js";

-export type PiWebSearchProvider = "auto" | "perplexity" | "gemini";
+export type PiWebSearchProvider = "auto" | "perplexity" | "exa" | "gemini";

 export type PiWebAccessConfig = Record<string, unknown> & {
+  route?: PiWebSearchProvider;
   provider?: PiWebSearchProvider;
   searchProvider?: PiWebSearchProvider;
   perplexityApiKey?: string;
+  exaApiKey?: string;
   geminiApiKey?: string;
   chromeProfile?: string;
 };
@@ -17,18 +19,20 @@ export type PiWebAccessStatus = {
   searchProvider: PiWebSearchProvider;
   requestProvider: PiWebSearchProvider;
   perplexityConfigured: boolean;
+  exaConfigured: boolean;
   geminiApiConfigured: boolean;
   chromeProfile?: string;
   routeLabel: string;
   note: string;
 };

-export function getPiWebSearchConfigPath(home = process.env.HOME ?? homedir()): string {
-  return resolve(home, ".feynman", "web-search.json");
+export function getPiWebSearchConfigPath(home?: string): string {
+  const feynmanHome = home ? resolve(home, ".feynman") : getFeynmanHome();
+  return resolve(feynmanHome, "web-search.json");
 }

 function normalizeProvider(value: unknown): PiWebSearchProvider | undefined {
-  return value === "auto" || value === "perplexity" || value === "gemini" ? value : undefined;
+  return value === "auto" || value === "perplexity" || value === "exa" || value === "gemini" ? value : undefined;
 }

 function normalizeNonEmptyString(value: unknown): string | undefined {
@@ -48,10 +52,29 @@ export function loadPiWebAccessConfig(configPath = getPiWebSearchConfigPath()):
   }
 }

+export function savePiWebAccessConfig(
+  updates: Partial<Record<keyof PiWebAccessConfig, unknown>>,
+  configPath = getPiWebSearchConfigPath(),
+): void {
+  const merged: Record<string, unknown> = { ...loadPiWebAccessConfig(configPath) };
+  for (const [key, value] of Object.entries(updates)) {
+    if (value === undefined) {
+      delete merged[key];
+    } else {
+      merged[key] = value;
+    }
+  }
+  mkdirSync(dirname(configPath), { recursive: true });
+  writeFileSync(configPath, JSON.stringify(merged, null, 2) + "\n", "utf8");
+}
+
 function formatRouteLabel(provider: PiWebSearchProvider): string {
   switch (provider) {
     case "perplexity":
       return "Perplexity";
+    case "exa":
+      return "Exa";
     case "gemini":
       return "Gemini";
     default:
@@ -63,10 +86,12 @@ function formatRouteNote(provider: PiWebSearchProvider): string {
   switch (provider) {
     case "perplexity":
       return "Pi web-access will use Perplexity for search.";
+    case "exa":
+      return "Pi web-access will use Exa for search.";
     case "gemini":
       return "Pi web-access will use Gemini API or Gemini Browser.";
     default:
-      return "Pi web-access will try Perplexity, then Gemini API, then Gemini Browser.";
+      return "Pi web-access will try Perplexity, then Exa, then Gemini API, then Gemini Browser.";
   }
 }
@@ -74,9 +99,11 @@ export function getPiWebAccessStatus(
   config: PiWebAccessConfig = loadPiWebAccessConfig(),
   configPath = getPiWebSearchConfigPath(),
 ): PiWebAccessStatus {
-  const searchProvider = normalizeProvider(config.searchProvider) ?? "auto";
-  const requestProvider = normalizeProvider(config.provider) ?? searchProvider;
+  const searchProvider =
+    normalizeProvider(config.searchProvider) ?? normalizeProvider(config.route) ?? normalizeProvider(config.provider) ?? "auto";
+  const requestProvider = normalizeProvider(config.provider) ?? normalizeProvider(config.route) ?? searchProvider;
   const perplexityConfigured = Boolean(normalizeNonEmptyString(config.perplexityApiKey));
+  const exaConfigured = Boolean(normalizeNonEmptyString(config.exaApiKey));
   const geminiApiConfigured = Boolean(normalizeNonEmptyString(config.geminiApiKey));
   const chromeProfile = normalizeNonEmptyString(config.chromeProfile);
   const effectiveProvider = searchProvider;
@@ -86,6 +113,7 @@ export function getPiWebAccessStatus(
     searchProvider,
     requestProvider,
     perplexityConfigured,
+    exaConfigured,
     geminiApiConfigured,
     chromeProfile,
     routeLabel: formatRouteLabel(effectiveProvider),
@@ -101,6 +129,7 @@ export function formatPiWebAccessDoctorLines(
     ` search route: ${status.routeLabel}`,
     ` request route: ${status.requestProvider}`,
     ` perplexity api: ${status.perplexityConfigured ? "configured" : "not configured"}`,
+    ` exa api: ${status.exaConfigured ? "configured" : "not configured"}`,
     ` gemini api: ${status.geminiApiConfigured ? "configured" : "not configured"}`,
     ` browser profile: ${status.chromeProfile ?? "default Chromium profile"}`,
     ` config path: ${status.configPath}`,
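The resolution change above gives the config keys a clear precedence: the newer `searchProvider` key wins, then the legacy `route` key, then `provider`, falling back to `"auto"`, with unrecognized values ignored at every step. A standalone sketch of that precedence chain:

```typescript
// Mirror of the provider-resolution precedence from web-access.ts.
type Provider = "auto" | "perplexity" | "exa" | "gemini";

function normalizeProvider(value: unknown): Provider | undefined {
  return value === "auto" || value === "perplexity" || value === "exa" || value === "gemini"
    ? value
    : undefined;
}

function resolveSearchProvider(config: Record<string, unknown>): Provider {
  return (
    normalizeProvider(config.searchProvider) ??
    normalizeProvider(config.route) ??      // legacy key, still honored
    normalizeProvider(config.provider) ??
    "auto"
  );
}

console.log(resolveSearchProvider({ route: "exa" }));                           // "exa"
console.log(resolveSearchProvider({ searchProvider: "gemini", route: "exa" })); // "gemini"
console.log(resolveSearchProvider({ searchProvider: "bing" }));                 // "auto"
```

The last case shows why normalization happens per key: an unknown value in a higher-precedence key falls through rather than masking a valid lower-precedence one.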


@@ -1,13 +1,58 @@
-import { getPiWebAccessStatus } from "../pi/web-access.js";
+import {
+  getPiWebAccessStatus,
+  savePiWebAccessConfig,
+  type PiWebAccessConfig,
+  type PiWebSearchProvider,
+} from "../pi/web-access.js";
 import { printInfo } from "../ui/terminal.js";

+const SEARCH_PROVIDERS: PiWebSearchProvider[] = ["auto", "perplexity", "exa", "gemini"];
+
+const PROVIDER_API_KEY_FIELDS: Partial<Record<PiWebSearchProvider, keyof PiWebAccessConfig>> = {
+  perplexity: "perplexityApiKey",
+  exa: "exaApiKey",
+  gemini: "geminiApiKey",
+};
+
 export function printSearchStatus(): void {
   const status = getPiWebAccessStatus();
   printInfo("Managed by: pi-web-access");
   printInfo(`Search route: ${status.routeLabel}`);
   printInfo(`Request route: ${status.requestProvider}`);
   printInfo(`Perplexity API configured: ${status.perplexityConfigured ? "yes" : "no"}`);
+  printInfo(`Exa API configured: ${status.exaConfigured ? "yes" : "no"}`);
   printInfo(`Gemini API configured: ${status.geminiApiConfigured ? "yes" : "no"}`);
   printInfo(`Browser profile: ${status.chromeProfile ?? "default Chromium profile"}`);
   printInfo(`Config path: ${status.configPath}`);
 }
+
+export function setSearchProvider(provider: PiWebSearchProvider, apiKey?: string): void {
+  if (!SEARCH_PROVIDERS.includes(provider)) {
+    throw new Error(`Usage: feynman search set <${SEARCH_PROVIDERS.join("|")}> [api-key]`);
+  }
+  if (apiKey !== undefined && provider === "auto") {
+    throw new Error("The auto provider does not use an API key. Usage: feynman search set auto");
+  }
+  const updates: Partial<Record<keyof PiWebAccessConfig, unknown>> = {
+    provider,
+    searchProvider: provider,
+    route: undefined,
+  };
+  const apiKeyField = PROVIDER_API_KEY_FIELDS[provider];
+  if (apiKeyField && apiKey !== undefined) {
+    updates[apiKeyField] = apiKey;
+  }
+  savePiWebAccessConfig(updates);
+  const status = getPiWebAccessStatus();
+  console.log(`Web search provider set to ${status.routeLabel}.`);
+  console.log(`Config path: ${status.configPath}`);
+}
+
+export function clearSearchConfig(): void {
+  savePiWebAccessConfig({ provider: undefined, searchProvider: undefined, route: undefined });
+  const status = getPiWebAccessStatus();
+  console.log(`Web search provider reset to ${status.routeLabel}.`);
+  console.log(`Config path: ${status.configPath}`);
+}
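Both commands above rely on savePiWebAccessConfig's merge semantics: an explicit `undefined` removes a key from the stored config (which is how the legacy `route` key gets cleared), while any other value overwrites it. An in-memory sketch of that merge (the real function then persists the result to disk):

```typescript
// Illustrative mirror of savePiWebAccessConfig's update merge.
function mergeConfig(
  current: Record<string, unknown>,
  updates: Record<string, unknown>,
): Record<string, unknown> {
  const merged = { ...current };
  for (const [key, value] of Object.entries(updates)) {
    if (value === undefined) {
      delete merged[key]; // explicit undefined removes the key entirely
    } else {
      merged[key] = value;
    }
  }
  return merged;
}

const next = mergeConfig(
  { route: "gemini", perplexityApiKey: "pk-example" }, // hypothetical stored config
  { provider: "exa", searchProvider: "exa", route: undefined },
);
console.log(next); // { perplexityApiKey: "pk-example", provider: "exa", searchProvider: "exa" }
```

Deleting rather than writing `undefined` keeps the persisted JSON clean, since `JSON.stringify` would silently drop `undefined` values anyway.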


@@ -1,6 +1,7 @@
-import { AuthStorage, ModelRegistry } from "@mariozechner/pi-coding-agent";
 import { getUserName as getAlphaUserName, isLoggedIn as isAlphaLoggedIn } from "@companion-ai/alpha-hub/lib";
+import { readFileSync } from "node:fs";
 import { formatPiWebAccessDoctorLines, getPiWebAccessStatus } from "../pi/web-access.js";
 import { BROWSER_FALLBACK_PATHS, PANDOC_FALLBACK_PATHS, resolveExecutable } from "../system/executables.js";
 import { readJson } from "../pi/settings.js";
@@ -8,6 +9,31 @@ import { validatePiInstallation } from "../pi/runtime.js";
 import { printInfo, printPanel, printSection } from "../ui/terminal.js";
 import { getCurrentModelSpec } from "../model/commands.js";
 import { buildModelStatusSnapshotFromRecords, getAvailableModelRecords, getSupportedModelRecords } from "../model/catalog.js";
+import { createModelRegistry, getModelsJsonPath } from "../model/registry.js";
+import { getConfiguredServiceTier } from "../model/service-tier.js";
+
+function findProvidersMissingApiKey(modelsJsonPath: string): string[] {
+  try {
+    const raw = readFileSync(modelsJsonPath, "utf8").trim();
+    if (!raw) return [];
+    const parsed = JSON.parse(raw) as any;
+    const providers = parsed?.providers;
+    if (!providers || typeof providers !== "object") return [];
+    const missing: string[] = [];
+    for (const [providerId, config] of Object.entries(providers as Record<string, unknown>)) {
+      if (!config || typeof config !== "object") continue;
+      const models = (config as any).models;
+      if (!Array.isArray(models) || models.length === 0) continue;
+      const apiKey = (config as any).apiKey;
+      if (typeof apiKey !== "string" || apiKey.trim().length === 0) {
+        missing.push(providerId);
+      }
+    }
+    return missing;
+  } catch {
+    return [];
+  }
+}

 export type DoctorOptions = {
   settingsPath: string;
@@ -80,6 +106,7 @@ export function runStatus(options: DoctorOptions): void {
   printInfo(`Recommended model: ${snapshot.recommendedModel ?? "not available"}`);
   printInfo(`alphaXiv: ${snapshot.alphaLoggedIn ? snapshot.alphaUser ?? "configured" : "not configured"}`);
   printInfo(`Web access: pi-web-access (${snapshot.webRouteLabel})`);
+  printInfo(`Service tier: ${getConfiguredServiceTier(options.settingsPath) ?? "not set"}`);
   printInfo(`Preview: ${snapshot.previewConfigured ? "configured" : "not configured"}`);

   printSection("Paths");
@@ -104,7 +131,7 @@ export function runStatus(options: DoctorOptions): void {
 export function runDoctor(options: DoctorOptions): void {
   const settings = readJson(options.settingsPath);
-  const modelRegistry = new ModelRegistry(AuthStorage.create(options.authPath));
+  const modelRegistry = createModelRegistry(options.authPath);
   const availableModels = modelRegistry.getAvailable();
   const pandocPath = resolveExecutable("pandoc", PANDOC_FALLBACK_PATHS);
   const browserPath = process.env.PUPPETEER_EXECUTABLE_PATH ?? resolveExecutable("google-chrome", BROWSER_FALLBACK_PATHS);
@@ -140,10 +167,26 @@ export function runDoctor(options: DoctorOptions): void {
   console.log(`default model valid: ${modelStatus.modelValid ? "yes" : "no"}`);
   console.log(`authenticated providers: ${modelStatus.authenticatedProviderCount}`);
   console.log(`authenticated models: ${modelStatus.authenticatedModelCount}`);
+  console.log(`service tier: ${getConfiguredServiceTier(options.settingsPath) ?? "not set"}`);
   console.log(`recommended model: ${modelStatus.recommendedModel ?? "not available"}`);
   if (modelStatus.recommendedModelReason) {
     console.log(`  why: ${modelStatus.recommendedModelReason}`);
   }
+  const modelsError = modelRegistry.getError();
+  if (modelsError) {
+    console.log("models.json: error");
+    for (const line of modelsError.split("\n")) {
+      console.log(`  ${line}`);
+    }
+  } else {
+    const modelsJsonPath = getModelsJsonPath(options.authPath);
+    console.log(`models.json: ${modelsJsonPath}`);
+    const missingApiKeyProviders = findProvidersMissingApiKey(modelsJsonPath);
+    if (missingApiKeyProviders.length > 0) {
+      console.log(`  warning: provider(s) missing apiKey: ${missingApiKeyProviders.join(", ")}`);
+      console.log("  note: custom providers with a models[] list need apiKey in models.json to be available.");
+    }
+  }
   console.log(`pandoc: ${pandocPath ?? "missing"}`);
   console.log(`browser preview runtime: ${browserPath ?? "missing"}`);
   for (const line of formatPiWebAccessDoctorLines()) {
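The apiKey check added to `feynman doctor` above flags any custom provider in models.json that declares a `models[]` list but no usable `apiKey`. An in-memory sketch of that core check, run against a hypothetical parsed models.json (provider names and fields here are made up for illustration):

```typescript
// Mirror of findProvidersMissingApiKey's core logic, minus the file I/O:
// a provider only counts as misconfigured if it declares models but its
// apiKey is absent, non-string, or blank.
function providersMissingApiKey(parsed: { providers?: Record<string, unknown> }): string[] {
  const providers = parsed.providers;
  if (!providers || typeof providers !== "object") return [];
  const missing: string[] = [];
  for (const [providerId, config] of Object.entries(providers)) {
    if (!config || typeof config !== "object") continue;
    const { models, apiKey } = config as { models?: unknown; apiKey?: unknown };
    if (!Array.isArray(models) || models.length === 0) continue; // no models declared: nothing to flag
    if (typeof apiKey !== "string" || apiKey.trim().length === 0) missing.push(providerId);
  }
  return missing;
}

// Hypothetical parsed models.json content:
const sample = {
  providers: {
    ollama: { baseUrl: "http://localhost:11434", models: [{ id: "llama3" }] }, // models but no apiKey
    openrouter: { apiKey: "sk-example", models: [{ id: "some/model" }] },      // fully configured
    empty: { models: [] },                                                     // no models: skipped
  },
};
console.log(providersMissingApiKey(sample)); // [ "ollama" ]
```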


@@ -13,13 +13,35 @@ export function setupPreviewDependencies(): PreviewSetupResult {
     return { status: "ready", message: `pandoc already installed at ${pandocPath}` };
   }
-  const brewPath = resolveExecutable("brew", BREW_FALLBACK_PATHS);
-  if (process.platform === "darwin" && brewPath) {
-    const result = spawnSync(brewPath, ["install", "pandoc"], { stdio: "inherit" });
-    if (result.status !== 0) {
-      throw new Error("Failed to install pandoc via Homebrew.");
+  if (process.platform === "darwin") {
+    const brewPath = resolveExecutable("brew", BREW_FALLBACK_PATHS);
+    if (brewPath) {
+      const result = spawnSync(brewPath, ["install", "pandoc"], { stdio: "inherit" });
+      if (result.status !== 0) {
+        throw new Error("Failed to install pandoc via Homebrew.");
+      }
+      return { status: "installed", message: "Preview dependency installed: pandoc" };
+    }
+  }
+
+  if (process.platform === "win32") {
+    const wingetPath = resolveExecutable("winget");
+    if (wingetPath) {
+      const result = spawnSync(wingetPath, ["install", "--id", "JohnMacFarlane.Pandoc", "-e"], { stdio: "inherit" });
+      if (result.status === 0) {
+        return { status: "installed", message: "Preview dependency installed: pandoc (via winget)" };
+      }
+    }
+  }
+
+  if (process.platform === "linux") {
+    const aptPath = resolveExecutable("apt-get");
+    if (aptPath) {
+      const result = spawnSync(aptPath, ["install", "-y", "pandoc"], { stdio: "inherit" });
+      if (result.status === 0) {
+        return { status: "installed", message: "Preview dependency installed: pandoc (via apt)" };
+      }
     }
-    return { status: "installed", message: "Preview dependency installed: pandoc" };
   }

   return {


@@ -29,6 +29,7 @@ function printNonInteractiveSetupGuidance(): void {
   printInfo("Non-interactive terminal. Use explicit commands:");
   printInfo("  feynman model login <provider>");
   printInfo("  feynman model set <provider/model>");
+  printInfo("  # or configure API keys via env vars/auth.json and rerun `feynman model list`");
   printInfo("  feynman alpha login");
   printInfo("  feynman doctor");
 }


@@ -1,27 +1,36 @@
import { spawnSync } from "node:child_process";
import { existsSync } from "node:fs";

const isWindows = process.platform === "win32";
const programFiles = process.env.PROGRAMFILES ?? "C:\\Program Files";
const localAppData = process.env.LOCALAPPDATA ?? "";

export const PANDOC_FALLBACK_PATHS = isWindows
  ? [`${programFiles}\\Pandoc\\pandoc.exe`]
  : ["/opt/homebrew/bin/pandoc", "/usr/local/bin/pandoc"];

export const BREW_FALLBACK_PATHS = isWindows
  ? []
  : ["/opt/homebrew/bin/brew", "/usr/local/bin/brew"];

export const BROWSER_FALLBACK_PATHS = isWindows
  ? [
      `${programFiles}\\Google\\Chrome\\Application\\chrome.exe`,
      `${programFiles} (x86)\\Google\\Chrome\\Application\\chrome.exe`,
      `${localAppData}\\Google\\Chrome\\Application\\chrome.exe`,
      `${programFiles}\\Microsoft\\Edge\\Application\\msedge.exe`,
      `${programFiles}\\BraveSoftware\\Brave-Browser\\Application\\brave.exe`,
    ]
  : [
      "/Applications/Google Chrome.app/Contents/MacOS/Google Chrome",
      "/Applications/Chromium.app/Contents/MacOS/Chromium",
      "/Applications/Brave Browser.app/Contents/MacOS/Brave Browser",
      "/Applications/Microsoft Edge.app/Contents/MacOS/Microsoft Edge",
    ];

export const MERMAID_FALLBACK_PATHS = isWindows
  ? []
  : ["/opt/homebrew/bin/mmdc", "/usr/local/bin/mmdc"];

export function resolveExecutable(name: string, fallbackPaths: string[] = []): string | undefined {
  for (const candidate of fallbackPaths) {
@@ -30,13 +39,19 @@ export function resolveExecutable(name: string, fallbackPaths: string[] = []): s
    }
  }
  const isWindows = process.platform === "win32";
  const result = isWindows
    ? spawnSync("cmd", ["/c", `where ${name}`], {
        encoding: "utf8",
        stdio: ["ignore", "pipe", "ignore"],
      })
    : spawnSync("sh", ["-lc", `command -v ${name}`], {
        encoding: "utf8",
        stdio: ["ignore", "pipe", "ignore"],
      });
  if (result.status === 0) {
    const resolved = result.stdout.trim().split(/\r?\n/)[0];
    if (resolved) {
      return resolved;
    }
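One subtlety in the change above: `where` on Windows may report several matches, one per CRLF-separated line, so the new code keeps only the first. A small self-contained illustration (the sample paths are made up):

```typescript
// Windows `where` can print multiple hits; take the first line,
// splitting on either CRLF or LF, as the resolver above does.
const sampleWhereOutput = "C:\\Program Files\\nodejs\\node.exe\r\nC:\\tools\\node.exe\r\n";
const firstMatch = sampleWhereOutput.trim().split(/\r?\n/)[0];
console.log(firstMatch); // C:\Program Files\nodejs\node.exe
```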


@@ -0,0 +1,45 @@
export const MIN_NODE_VERSION = "20.19.0";
type ParsedNodeVersion = {
major: number;
minor: number;
patch: number;
};
function parseNodeVersion(version: string): ParsedNodeVersion {
const [major = "0", minor = "0", patch = "0"] = version.replace(/^v/, "").split(".");
return {
major: Number.parseInt(major, 10) || 0,
minor: Number.parseInt(minor, 10) || 0,
patch: Number.parseInt(patch, 10) || 0,
};
}
function compareNodeVersions(left: ParsedNodeVersion, right: ParsedNodeVersion): number {
if (left.major !== right.major) return left.major - right.major;
if (left.minor !== right.minor) return left.minor - right.minor;
return left.patch - right.patch;
}
export function isSupportedNodeVersion(version = process.versions.node): boolean {
return compareNodeVersions(parseNodeVersion(version), parseNodeVersion(MIN_NODE_VERSION)) >= 0;
}
export function getUnsupportedNodeVersionLines(version = process.versions.node): string[] {
const isWindows = process.platform === "win32";
return [
`feynman requires Node.js ${MIN_NODE_VERSION} or later (detected ${version}).`,
isWindows
? "Install a newer Node.js from https://nodejs.org, or use the standalone installer:"
: "Switch to Node 20 with `nvm install 20 && nvm use 20`, or use the standalone installer:",
isWindows
? "irm https://feynman.is/install.ps1 | iex"
: "curl -fsSL https://feynman.is/install | bash",
];
}
export function ensureSupportedNodeVersion(version = process.versions.node): void {
if (!isSupportedNodeVersion(version)) {
throw new Error(getUnsupportedNodeVersionLines(version).join("\n"));
}
}
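The field-by-field numeric comparison above exists because plain string comparison mis-orders versions once components reach two digits. A condensed self-contained sketch of the same idea (not the module's exact code):

```typescript
// Lexicographic comparison is wrong for versions: "9" sorts after "2".
console.log("9.0.0" > "20.0.0"); // true, which is semantically wrong

// Numeric, component-wise comparison mirrors parseNodeVersion/compareNodeVersions.
function atLeast(version: string, floor: string): boolean {
  const parse = (v: string) => v.replace(/^v/, "").split(".").map((n) => Number.parseInt(n, 10) || 0);
  const a = parse(version);
  const b = parse(floor);
  for (let i = 0; i < 3; i++) {
    if ((a[i] ?? 0) !== (b[i] ?? 0)) return (a[i] ?? 0) > (b[i] ?? 0);
  }
  return true; // equal to the floor counts as supported
}

console.log(atLeast("20.19.0", "20.19.0")); // true
console.log(atLeast("18.17.0", "20.19.0")); // false
```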

src/system/open-url.ts

@@ -0,0 +1,51 @@
import { spawn } from "node:child_process";
import { resolveExecutable } from "./executables.js";
type ResolveExecutableFn = (name: string, fallbackPaths?: string[]) => string | undefined;
type OpenUrlCommand = {
command: string;
args: string[];
};
export function getOpenUrlCommand(
url: string,
platform = process.platform,
resolveCommand: ResolveExecutableFn = resolveExecutable,
): OpenUrlCommand | undefined {
if (platform === "win32") {
return {
command: "cmd",
args: ["/c", "start", "", url],
};
}
if (platform === "darwin") {
const command = resolveCommand("open");
return command ? { command, args: [url] } : undefined;
}
const command = resolveCommand("xdg-open");
return command ? { command, args: [url] } : undefined;
}
export function openUrl(url: string): boolean {
const command = getOpenUrlCommand(url);
if (!command) {
return false;
}
try {
const child = spawn(command.command, command.args, {
detached: true,
stdio: "ignore",
windowsHide: true,
});
child.on("error", () => {});
child.unref();
return true;
} catch {
return false;
}
}
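The platform dispatch in `getOpenUrlCommand` reduces to a pure lookup; a minimal sketch (not the module itself) of that selection logic:

```typescript
type OpenCmd = { command: string; args: string[] };

// Pure dispatch sketch: cmd's built-in `start` on Windows (the empty string
// fills the window-title slot so the URL is not mistaken for a title),
// `open` on macOS, `xdg-open` elsewhere.
function pickOpener(url: string, platform: string): OpenCmd {
  if (platform === "win32") return { command: "cmd", args: ["/c", "start", "", url] };
  if (platform === "darwin") return { command: "open", args: [url] };
  return { command: "xdg-open", args: [url] };
}

console.log(pickOpener("https://example.com", "linux"));
```

The `detached: true` plus `child.unref()` pairing in `openUrl` lets the CLI exit without waiting on the browser process.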


@@ -1,6 +1,6 @@
import test from "node:test";
import assert from "node:assert/strict";
import { existsSync, mkdtempSync, mkdirSync, readFileSync, rmSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";
@@ -49,3 +49,34 @@ test("syncBundledAssets preserves user-modified files and updates managed files"
assert.equal(readFileSync(join(agentDir, "themes", "feynman.json"), "utf8"), '{"theme":"v2"}\n');
assert.equal(readFileSync(join(agentDir, "agents", "researcher.md"), "utf8"), "# user-custom\n");
});
test("syncBundledAssets removes deleted managed files but preserves user-modified stale files", () => {
const appRoot = createAppRoot();
const home = mkdtempSync(join(tmpdir(), "feynman-home-"));
process.env.FEYNMAN_HOME = home;
const agentDir = join(home, "agent");
mkdirSync(agentDir, { recursive: true });
mkdirSync(join(appRoot, "skills", "paper-eli5"), { recursive: true });
writeFileSync(join(appRoot, "skills", "paper-eli5", "SKILL.md"), "# old skill\n", "utf8");
syncBundledAssets(appRoot, agentDir);
rmSync(join(appRoot, "skills", "paper-eli5"), { recursive: true, force: true });
mkdirSync(join(appRoot, "skills", "eli5"), { recursive: true });
writeFileSync(join(appRoot, "skills", "eli5", "SKILL.md"), "# new skill\n", "utf8");
const firstResult = syncBundledAssets(appRoot, agentDir);
assert.deepEqual(firstResult.copied, ["eli5/SKILL.md"]);
assert.equal(existsSync(join(agentDir, "skills", "paper-eli5", "SKILL.md")), false);
assert.equal(readFileSync(join(agentDir, "skills", "eli5", "SKILL.md"), "utf8"), "# new skill\n");
mkdirSync(join(appRoot, "skills", "legacy"), { recursive: true });
writeFileSync(join(appRoot, "skills", "legacy", "SKILL.md"), "# managed legacy\n", "utf8");
syncBundledAssets(appRoot, agentDir);
writeFileSync(join(agentDir, "skills", "legacy", "SKILL.md"), "# user legacy override\n", "utf8");
rmSync(join(appRoot, "skills", "legacy"), { recursive: true, force: true });
const secondResult = syncBundledAssets(appRoot, agentDir);
assert.deepEqual(secondResult.skipped, ["legacy/SKILL.md"]);
assert.equal(readFileSync(join(agentDir, "skills", "legacy", "SKILL.md"), "utf8"), "# user legacy override\n");
});


@@ -0,0 +1,110 @@
import test from "node:test";
import assert from "node:assert/strict";
import { buildModelStatusSnapshotFromRecords } from "../src/model/catalog.js";
test("buildModelStatusSnapshotFromRecords returns empty guidance when model is set and valid", () => {
const snapshot = buildModelStatusSnapshotFromRecords(
[{ provider: "anthropic", id: "claude-opus-4-6" }],
[{ provider: "anthropic", id: "claude-opus-4-6" }],
"anthropic/claude-opus-4-6",
);
assert.equal(snapshot.currentValid, true);
assert.equal(snapshot.current, "anthropic/claude-opus-4-6");
assert.equal(snapshot.guidance.length, 0);
});
test("buildModelStatusSnapshotFromRecords emits guidance when no models are available", () => {
const snapshot = buildModelStatusSnapshotFromRecords([], [], undefined);
assert.equal(snapshot.currentValid, false);
assert.equal(snapshot.current, undefined);
assert.equal(snapshot.recommended, undefined);
assert.ok(snapshot.guidance.some((line) => line.includes("No authenticated Pi models")));
});
test("buildModelStatusSnapshotFromRecords emits guidance when no default model is set", () => {
const snapshot = buildModelStatusSnapshotFromRecords(
[{ provider: "openai", id: "gpt-5.4" }],
[{ provider: "openai", id: "gpt-5.4" }],
undefined,
);
assert.equal(snapshot.currentValid, false);
assert.equal(snapshot.current, undefined);
assert.ok(snapshot.guidance.some((line) => line.includes("No default research model")));
});
test("buildModelStatusSnapshotFromRecords marks provider as configured only when it has available models", () => {
const snapshot = buildModelStatusSnapshotFromRecords(
[
{ provider: "anthropic", id: "claude-opus-4-6" },
{ provider: "openai", id: "gpt-5.4" },
],
[{ provider: "openai", id: "gpt-5.4" }],
"openai/gpt-5.4",
);
const anthropicProvider = snapshot.providers.find((provider) => provider.id === "anthropic");
const openaiProvider = snapshot.providers.find((provider) => provider.id === "openai");
assert.ok(anthropicProvider);
assert.equal(anthropicProvider!.configured, false);
assert.equal(anthropicProvider!.supportedModels, 1);
assert.equal(anthropicProvider!.availableModels, 0);
assert.ok(openaiProvider);
assert.equal(openaiProvider!.configured, true);
assert.equal(openaiProvider!.supportedModels, 1);
assert.equal(openaiProvider!.availableModels, 1);
});
test("buildModelStatusSnapshotFromRecords marks provider as current when selected model belongs to it", () => {
const snapshot = buildModelStatusSnapshotFromRecords(
[
{ provider: "anthropic", id: "claude-opus-4-6" },
{ provider: "openai", id: "gpt-5.4" },
],
[
{ provider: "anthropic", id: "claude-opus-4-6" },
{ provider: "openai", id: "gpt-5.4" },
],
"anthropic/claude-opus-4-6",
);
const anthropicProvider = snapshot.providers.find((provider) => provider.id === "anthropic");
const openaiProvider = snapshot.providers.find((provider) => provider.id === "openai");
assert.equal(anthropicProvider!.current, true);
assert.equal(openaiProvider!.current, false);
});
test("buildModelStatusSnapshotFromRecords returns available models sorted by research preference", () => {
const snapshot = buildModelStatusSnapshotFromRecords(
[
{ provider: "openai", id: "gpt-5.4" },
{ provider: "anthropic", id: "claude-opus-4-6" },
],
[
{ provider: "openai", id: "gpt-5.4" },
{ provider: "anthropic", id: "claude-opus-4-6" },
],
undefined,
);
assert.equal(snapshot.availableModels[0], "anthropic/claude-opus-4-6");
assert.equal(snapshot.availableModels[1], "openai/gpt-5.4");
assert.equal(snapshot.recommended, "anthropic/claude-opus-4-6");
});
test("buildModelStatusSnapshotFromRecords sets currentValid false when current model is not in available list", () => {
const snapshot = buildModelStatusSnapshotFromRecords(
[{ provider: "anthropic", id: "claude-opus-4-6" }],
[],
"anthropic/claude-opus-4-6",
);
assert.equal(snapshot.currentValid, false);
assert.equal(snapshot.current, "anthropic/claude-opus-4-6");
});


@@ -0,0 +1,92 @@
import test from "node:test";
import assert from "node:assert/strict";
import { existsSync, mkdtempSync, rmSync } from "node:fs";
import { tmpdir } from "node:os";
import { join, resolve } from "node:path";
import {
ensureFeynmanHome,
getBootstrapStatePath,
getDefaultSessionDir,
getFeynmanAgentDir,
getFeynmanHome,
getFeynmanMemoryDir,
getFeynmanStateDir,
} from "../src/config/paths.js";
test("getFeynmanHome uses FEYNMAN_HOME env var when set", () => {
const previous = process.env.FEYNMAN_HOME;
try {
process.env.FEYNMAN_HOME = "/custom/home";
assert.equal(getFeynmanHome(), resolve("/custom/home", ".feynman"));
} finally {
if (previous === undefined) {
delete process.env.FEYNMAN_HOME;
} else {
process.env.FEYNMAN_HOME = previous;
}
}
});
test("getFeynmanHome falls back to homedir when FEYNMAN_HOME is unset", () => {
const previous = process.env.FEYNMAN_HOME;
try {
delete process.env.FEYNMAN_HOME;
const home = getFeynmanHome();
assert.ok(home.endsWith(".feynman"), `expected path ending in .feynman, got: ${home}`);
assert.ok(!home.includes("undefined"), `expected no 'undefined' in path, got: ${home}`);
} finally {
if (previous === undefined) {
delete process.env.FEYNMAN_HOME;
} else {
process.env.FEYNMAN_HOME = previous;
}
}
});
test("getFeynmanAgentDir resolves to <home>/agent", () => {
assert.equal(getFeynmanAgentDir("/some/home"), resolve("/some/home", "agent"));
});
test("getFeynmanMemoryDir resolves to <home>/memory", () => {
assert.equal(getFeynmanMemoryDir("/some/home"), resolve("/some/home", "memory"));
});
test("getFeynmanStateDir resolves to <home>/.state", () => {
assert.equal(getFeynmanStateDir("/some/home"), resolve("/some/home", ".state"));
});
test("getDefaultSessionDir resolves to <home>/sessions", () => {
assert.equal(getDefaultSessionDir("/some/home"), resolve("/some/home", "sessions"));
});
test("getBootstrapStatePath resolves to <home>/.state/bootstrap.json", () => {
assert.equal(getBootstrapStatePath("/some/home"), resolve("/some/home", ".state", "bootstrap.json"));
});
test("ensureFeynmanHome creates all required subdirectories", () => {
const root = mkdtempSync(join(tmpdir(), "feynman-paths-"));
try {
const home = join(root, "home");
ensureFeynmanHome(home);
assert.ok(existsSync(home), "home dir should exist");
assert.ok(existsSync(join(home, "agent")), "agent dir should exist");
assert.ok(existsSync(join(home, "memory")), "memory dir should exist");
assert.ok(existsSync(join(home, ".state")), ".state dir should exist");
assert.ok(existsSync(join(home, "sessions")), "sessions dir should exist");
} finally {
rmSync(root, { recursive: true, force: true });
}
});
test("ensureFeynmanHome is idempotent when dirs already exist", () => {
const root = mkdtempSync(join(tmpdir(), "feynman-paths-"));
try {
const home = join(root, "home");
ensureFeynmanHome(home);
assert.doesNotThrow(() => ensureFeynmanHome(home));
} finally {
rmSync(root, { recursive: true, force: true });
}
});
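The resolution rule these tests pin down can be read as: an explicit FEYNMAN_HOME overrides the OS home directory, and `.feynman` is appended either way. A sketch of that rule (an inference from the assertions, not the module's exact source):

```typescript
import { homedir } from "node:os";
import { resolve } from "node:path";

// FEYNMAN_HOME wins when set and non-empty; otherwise fall back to os.homedir().
function feynmanHomeSketch(env: Record<string, string | undefined>): string {
  const base = env.FEYNMAN_HOME?.trim() || homedir();
  return resolve(base, ".feynman");
}

console.log(feynmanHomeSketch({ FEYNMAN_HOME: "/custom/home" })); // /custom/home/.feynman on POSIX
```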


@@ -0,0 +1,32 @@
import test from "node:test";
import assert from "node:assert/strict";
import { readdirSync, readFileSync } from "node:fs";
import { dirname, join, resolve } from "node:path";
import { fileURLToPath } from "node:url";
const repoRoot = resolve(dirname(fileURLToPath(import.meta.url)), "..");
const bannedPatterns = [/ValiChord/i, /Harmony Record/i, /harmony_record_/i];
function collectMarkdownFiles(root: string): string[] {
const files: string[] = [];
for (const entry of readdirSync(root, { withFileTypes: true })) {
const fullPath = join(root, entry.name);
if (entry.isDirectory()) {
files.push(...collectMarkdownFiles(fullPath));
continue;
}
if (entry.isFile() && fullPath.endsWith(".md")) {
files.push(fullPath);
}
}
return files;
}
test("bundled prompts and skills do not contain blocked promotional product content", () => {
for (const filePath of [...collectMarkdownFiles(join(repoRoot, "prompts")), ...collectMarkdownFiles(join(repoRoot, "skills"))]) {
const content = readFileSync(filePath, "utf8");
for (const pattern of bannedPatterns) {
assert.doesNotMatch(content, pattern, `${filePath} contains blocked promotional pattern ${pattern}`);
}
}
});


@@ -57,6 +57,16 @@ test("buildModelStatusSnapshotFromRecords flags an invalid current model and sug
assert.ok(snapshot.guidance.some((line) => line.includes("Configured default model is unavailable")));
});
test("chooseRecommendedModel prefers MiniMax M2.7 over highspeed when that is the authenticated provider", () => {
const authPath = createAuthPath({
minimax: { type: "api_key", key: "minimax-test-key" },
});
const recommendation = chooseRecommendedModel(authPath);
assert.equal(recommendation?.spec, "minimax/MiniMax-M2.7");
});
test("resolveInitialPrompt maps top-level research commands to Pi slash workflows", () => {
const workflows = new Set(["lit", "watch", "jobs", "deepresearch"]);
assert.equal(resolveInitialPrompt("lit", ["tool-using", "agents"], undefined, workflows), "/lit tool-using agents");
@@ -65,4 +75,3 @@ test("resolveInitialPrompt maps top-level research commands to Pi slash workflow
assert.equal(resolveInitialPrompt("chat", ["hello"], undefined, workflows), "hello");
assert.equal(resolveInitialPrompt("unknown", ["topic"], undefined, workflows), "unknown topic");
});

tests/models-json.test.ts

@@ -0,0 +1,32 @@
import test from "node:test";
import assert from "node:assert/strict";
import { mkdtempSync, readFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";
import { upsertProviderConfig } from "../src/model/models-json.js";
test("upsertProviderConfig creates models.json and merges provider config", () => {
const dir = mkdtempSync(join(tmpdir(), "feynman-models-"));
const modelsPath = join(dir, "models.json");
const first = upsertProviderConfig(modelsPath, "custom", {
baseUrl: "http://localhost:11434/v1",
apiKey: "ollama",
api: "openai-completions",
authHeader: true,
models: [{ id: "llama3.1:8b" }],
});
assert.deepEqual(first, { ok: true });
const second = upsertProviderConfig(modelsPath, "custom", {
baseUrl: "http://localhost:9999/v1",
});
assert.deepEqual(second, { ok: true });
const parsed = JSON.parse(readFileSync(modelsPath, "utf8")) as any;
assert.equal(parsed.providers.custom.baseUrl, "http://localhost:9999/v1");
assert.equal(parsed.providers.custom.api, "openai-completions");
assert.equal(parsed.providers.custom.authHeader, true);
assert.deepEqual(parsed.providers.custom.models, [{ id: "llama3.1:8b" }]);
});
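The second upsert in this test overrides only `baseUrl` while `api`, `authHeader`, and `models` survive, which implies shallow-merge semantics per provider. A minimal sketch of that behavior (assumed; the real `upsertProviderConfig` also handles the file I/O):

```typescript
type ProviderConfig = Record<string, unknown>;

// Later upserts override only the keys they supply; everything else is kept.
function mergeProvider(existing: ProviderConfig | undefined, update: ProviderConfig): ProviderConfig {
  return { ...existing, ...update };
}

const first = mergeProvider(undefined, {
  baseUrl: "http://localhost:11434/v1",
  api: "openai-completions",
  authHeader: true,
  models: [{ id: "llama3.1:8b" }],
});
const second = mergeProvider(first, { baseUrl: "http://localhost:9999/v1" });
console.log(second.baseUrl); // http://localhost:9999/v1
console.log(second.api); // openai-completions
```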


@@ -0,0 +1,35 @@
import test from "node:test";
import assert from "node:assert/strict";
import {
MIN_NODE_VERSION,
ensureSupportedNodeVersion,
getUnsupportedNodeVersionLines,
isSupportedNodeVersion,
} from "../src/system/node-version.js";
test("isSupportedNodeVersion enforces the exact minimum floor", () => {
assert.equal(isSupportedNodeVersion("20.19.0"), true);
assert.equal(isSupportedNodeVersion("21.0.0"), true);
assert.equal(isSupportedNodeVersion("20.18.1"), false);
assert.equal(isSupportedNodeVersion("18.17.0"), false);
});
test("ensureSupportedNodeVersion throws a guided upgrade message", () => {
assert.throws(
() => ensureSupportedNodeVersion("18.17.0"),
(error: unknown) =>
error instanceof Error &&
error.message.includes(`Node.js ${MIN_NODE_VERSION}`) &&
error.message.includes("nvm install 20 && nvm use 20") &&
error.message.includes("https://feynman.is/install"),
);
});
test("unsupported version guidance reports the detected version", () => {
const lines = getUnsupportedNodeVersionLines("18.17.0");
assert.equal(lines[0], "feynman requires Node.js 20.19.0 or later (detected 18.17.0).");
assert.ok(lines.some((line) => line.includes("curl -fsSL https://feynman.is/install | bash")));
});

tests/open-url.test.ts

@@ -0,0 +1,45 @@
import test from "node:test";
import assert from "node:assert/strict";
import { getOpenUrlCommand } from "../src/system/open-url.js";
test("getOpenUrlCommand uses open on macOS when available", () => {
const command = getOpenUrlCommand(
"https://example.com",
"darwin",
(name) => (name === "open" ? "/usr/bin/open" : undefined),
);
assert.deepEqual(command, {
command: "/usr/bin/open",
args: ["https://example.com"],
});
});
test("getOpenUrlCommand uses xdg-open on Linux when available", () => {
const command = getOpenUrlCommand(
"https://example.com",
"linux",
(name) => (name === "xdg-open" ? "/usr/bin/xdg-open" : undefined),
);
assert.deepEqual(command, {
command: "/usr/bin/xdg-open",
args: ["https://example.com"],
});
});
test("getOpenUrlCommand uses cmd start on Windows", () => {
const command = getOpenUrlCommand("https://example.com", "win32");
assert.deepEqual(command, {
command: "cmd",
args: ["/c", "start", "", "https://example.com"],
});
});
test("getOpenUrlCommand returns undefined when no opener is available", () => {
const command = getOpenUrlCommand("https://example.com", "linux", () => undefined);
assert.equal(command, undefined);
});


@@ -0,0 +1,42 @@
import test from "node:test";
import assert from "node:assert/strict";
import { patchPiExtensionLoaderSource } from "../scripts/lib/pi-extension-loader-patch.mjs";
test("patchPiExtensionLoaderSource rewrites Windows extension imports to file URLs", () => {
const input = [
'import * as path from "node:path";',
'import { fileURLToPath } from "node:url";',
"async function loadExtensionModule(extensionPath) {",
" const jiti = createJiti(import.meta.url);",
' const module = await jiti.import(extensionPath, { default: true });',
" return module;",
"}",
"",
].join("\n");
const patched = patchPiExtensionLoaderSource(input);
assert.match(patched, /pathToFileURL/);
assert.match(patched, /process\.platform === "win32"/);
assert.match(patched, /path\.isAbsolute\(extensionPath\)/);
assert.match(patched, /jiti\.import\(extensionSpecifier, \{ default: true \}\)/);
});
test("patchPiExtensionLoaderSource is idempotent", () => {
const input = [
'import * as path from "node:path";',
'import { fileURLToPath } from "node:url";',
"async function loadExtensionModule(extensionPath) {",
" const jiti = createJiti(import.meta.url);",
' const module = await jiti.import(extensionPath, { default: true });',
" return module;",
"}",
"",
].join("\n");
const once = patchPiExtensionLoaderSource(input);
const twice = patchPiExtensionLoaderSource(once);
assert.equal(twice, once);
});


@@ -0,0 +1,42 @@
import test from "node:test";
import assert from "node:assert/strict";
import { patchPiGoogleLegacySchemaSource } from "../scripts/lib/pi-google-legacy-schema-patch.mjs";
test("patchPiGoogleLegacySchemaSource rewrites legacy parameters conversion to normalize const", () => {
const input = [
"export function convertTools(tools, useParameters = false) {",
" if (tools.length === 0) return undefined;",
" return [",
" {",
" functionDeclarations: tools.map((tool) => ({",
" name: tool.name,",
" description: tool.description,",
' ...(useParameters ? { parameters: tool.parameters } : { parametersJsonSchema: tool.parameters }),',
" })),",
" },",
" ];",
"}",
"",
].join("\n");
const patched = patchPiGoogleLegacySchemaSource(input);
assert.match(patched, /function normalizeLegacyToolSchema\(schema\)/);
assert.match(patched, /normalized\.enum = \[value\]/);
assert.match(patched, /parameters: normalizeLegacyToolSchema\(tool\.parameters\)/);
});
test("patchPiGoogleLegacySchemaSource is idempotent", () => {
const input = [
"export function convertTools(tools, useParameters = false) {",
' ...(useParameters ? { parameters: tool.parameters } : { parametersJsonSchema: tool.parameters }),',
"}",
"",
].join("\n");
const once = patchPiGoogleLegacySchemaSource(input);
const twice = patchPiGoogleLegacySchemaSource(once);
assert.equal(twice, once);
});

tests/pi-launch.test.ts

@@ -0,0 +1,9 @@
import test from "node:test";
import assert from "node:assert/strict";
import { exitCodeFromSignal } from "../src/pi/launch.js";
test("exitCodeFromSignal maps POSIX signals to conventional shell exit codes", () => {
assert.equal(exitCodeFromSignal("SIGTERM"), 143);
assert.equal(exitCodeFromSignal("SIGSEGV"), 139);
});
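The expected values follow the POSIX shell convention that a process killed by a signal exits with 128 plus the signal number (SIGTERM is 15, SIGSEGV is 11). A sketch of a mapping consistent with these assertions (the real `exitCodeFromSignal` may cover more signals):

```typescript
// 128 + signal number, per the common shell convention.
const SIGNAL_NUMBERS: Record<string, number> = {
  SIGHUP: 1,
  SIGINT: 2,
  SIGKILL: 9,
  SIGSEGV: 11,
  SIGTERM: 15,
};

function exitCodeFromSignalSketch(signal: string): number | undefined {
  const num = SIGNAL_NUMBERS[signal];
  return num === undefined ? undefined : 128 + num;
}

console.log(exitCodeFromSignalSketch("SIGTERM")); // 143
console.log(exitCodeFromSignalSketch("SIGSEGV")); // 139
```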


@@ -1,7 +1,8 @@
import test from "node:test";
import assert from "node:assert/strict";
import { pathToFileURL } from "node:url";
import { applyFeynmanPackageManagerEnv, buildPiArgs, buildPiEnv, resolvePiPaths, toNodeImportSpecifier } from "../src/pi/runtime.js";
test("buildPiArgs includes configured runtime paths and prompt", () => {
const args = buildPiArgs({
@@ -9,6 +10,7 @@ test("buildPiArgs includes configured runtime paths and prompt", () => {
workingDir: "/workspace",
sessionDir: "/sessions",
feynmanAgentDir: "/home/.feynman/agent",
mode: "rpc",
initialPrompt: "hello",
explicitModelSpec: "openai:gpt-5.4",
thinkingLevel: "medium",
@@ -21,6 +23,8 @@ test("buildPiArgs includes configured runtime paths and prompt", () => {
"/repo/feynman/extensions/research-tools.ts",
"--prompt-template",
"/repo/feynman/prompts",
"--mode",
"rpc",
"--model",
"openai:gpt-5.4",
"--thinking",
@@ -30,6 +34,11 @@ test("buildPiArgs includes configured runtime paths and prompt", () => {
});
test("buildPiEnv wires Feynman paths into the Pi environment", () => {
const previousUppercasePrefix = process.env.NPM_CONFIG_PREFIX;
const previousLowercasePrefix = process.env.npm_config_prefix;
process.env.NPM_CONFIG_PREFIX = "/tmp/global-prefix";
process.env.npm_config_prefix = "/tmp/global-prefix-lower";
const env = buildPiEnv({
appRoot: "/repo/feynman",
workingDir: "/workspace",
@@ -38,9 +47,62 @@
feynmanVersion: "0.1.5",
});
try {
assert.equal(env.FEYNMAN_SESSION_DIR, "/sessions");
assert.equal(env.FEYNMAN_BIN_PATH, "/repo/feynman/bin/feynman.js");
assert.equal(env.FEYNMAN_MEMORY_DIR, "/home/.feynman/memory");
assert.equal(env.FEYNMAN_NPM_PREFIX, "/home/.feynman/npm-global");
assert.equal(env.NPM_CONFIG_PREFIX, "/home/.feynman/npm-global");
assert.equal(env.npm_config_prefix, "/home/.feynman/npm-global");
assert.equal(env.PI_CODING_AGENT_DIR, "/home/.feynman/agent");
assert.ok(
env.PATH?.startsWith(
"/repo/feynman/node_modules/.bin:/repo/feynman/.feynman/npm/node_modules/.bin:/home/.feynman/npm-global/bin:",
),
);
} finally {
if (previousUppercasePrefix === undefined) {
delete process.env.NPM_CONFIG_PREFIX;
} else {
process.env.NPM_CONFIG_PREFIX = previousUppercasePrefix;
}
if (previousLowercasePrefix === undefined) {
delete process.env.npm_config_prefix;
} else {
process.env.npm_config_prefix = previousLowercasePrefix;
}
}
});
test("applyFeynmanPackageManagerEnv pins npm globals to the Feynman prefix", () => {
const previousFeynmanPrefix = process.env.FEYNMAN_NPM_PREFIX;
const previousUppercasePrefix = process.env.NPM_CONFIG_PREFIX;
const previousLowercasePrefix = process.env.npm_config_prefix;
try {
const prefix = applyFeynmanPackageManagerEnv("/home/.feynman/agent");
assert.equal(prefix, "/home/.feynman/npm-global");
assert.equal(process.env.FEYNMAN_NPM_PREFIX, "/home/.feynman/npm-global");
assert.equal(process.env.NPM_CONFIG_PREFIX, "/home/.feynman/npm-global");
assert.equal(process.env.npm_config_prefix, "/home/.feynman/npm-global");
} finally {
if (previousFeynmanPrefix === undefined) {
delete process.env.FEYNMAN_NPM_PREFIX;
} else {
process.env.FEYNMAN_NPM_PREFIX = previousFeynmanPrefix;
}
if (previousUppercasePrefix === undefined) {
delete process.env.NPM_CONFIG_PREFIX;
} else {
process.env.NPM_CONFIG_PREFIX = previousUppercasePrefix;
}
if (previousLowercasePrefix === undefined) {
delete process.env.npm_config_prefix;
} else {
process.env.npm_config_prefix = previousLowercasePrefix;
}
}
});
test("resolvePiPaths includes the Promise.withResolvers polyfill path", () => {
@@ -48,3 +110,11 @@ test("resolvePiPaths includes the Promise.withResolvers polyfill path", () => {
assert.equal(paths.promisePolyfillPath, "/repo/feynman/dist/system/promise-polyfill.js");
});
test("toNodeImportSpecifier converts absolute preload paths to file URLs", () => {
assert.equal(
toNodeImportSpecifier("/repo/feynman/dist/system/promise-polyfill.js"),
pathToFileURL("/repo/feynman/dist/system/promise-polyfill.js").href,
);
assert.equal(toNodeImportSpecifier("tsx"), "tsx");
});


@@ -0,0 +1,104 @@
import test from "node:test";
import assert from "node:assert/strict";
import { patchPiSubagentsSource } from "../scripts/lib/pi-subagents-patch.mjs";
const CASES = [
{
name: "index.ts config path",
file: "index.ts",
input: [
'import * as os from "node:os";',
'import * as path from "node:path";',
'const configPath = path.join(os.homedir(), ".pi", "agent", "extensions", "subagent", "config.json");',
"",
].join("\n"),
original: 'const configPath = path.join(os.homedir(), ".pi", "agent", "extensions", "subagent", "config.json");',
expected: 'const configPath = path.join(resolvePiAgentDir(), "extensions", "subagent", "config.json");',
},
{
name: "agents.ts user agents dir",
file: "agents.ts",
input: [
'import * as os from "node:os";',
'import * as path from "node:path";',
'const userDir = path.join(os.homedir(), ".pi", "agent", "agents");',
"",
].join("\n"),
original: 'const userDir = path.join(os.homedir(), ".pi", "agent", "agents");',
expected: 'const userDir = path.join(resolvePiAgentDir(), "agents");',
},
{
name: "artifacts.ts sessions dir",
file: "artifacts.ts",
input: [
'import * as os from "node:os";',
'import * as path from "node:path";',
'const sessionsBase = path.join(os.homedir(), ".pi", "agent", "sessions");',
"",
].join("\n"),
original: 'const sessionsBase = path.join(os.homedir(), ".pi", "agent", "sessions");',
expected: 'const sessionsBase = path.join(resolvePiAgentDir(), "sessions");',
},
{
name: "run-history.ts history file",
file: "run-history.ts",
input: [
'import * as os from "node:os";',
'import * as path from "node:path";',
'const HISTORY_PATH = path.join(os.homedir(), ".pi", "agent", "run-history.jsonl");',
"",
].join("\n"),
original: 'const HISTORY_PATH = path.join(os.homedir(), ".pi", "agent", "run-history.jsonl");',
expected: 'const HISTORY_PATH = path.join(resolvePiAgentDir(), "run-history.jsonl");',
},
{
name: "skills.ts agent dir",
file: "skills.ts",
input: [
'import * as os from "node:os";',
'import * as path from "node:path";',
'const AGENT_DIR = path.join(os.homedir(), ".pi", "agent");',
"",
].join("\n"),
original: 'const AGENT_DIR = path.join(os.homedir(), ".pi", "agent");',
expected: "const AGENT_DIR = resolvePiAgentDir();",
},
{
name: "chain-clarify.ts chain save dir",
file: "chain-clarify.ts",
input: [
'import * as os from "node:os";',
'import * as path from "node:path";',
'const dir = path.join(os.homedir(), ".pi", "agent", "agents");',
"",
].join("\n"),
original: 'const dir = path.join(os.homedir(), ".pi", "agent", "agents");',
expected: 'const dir = path.join(resolvePiAgentDir(), "agents");',
},
];
for (const scenario of CASES) {
test(`patchPiSubagentsSource rewrites ${scenario.name}`, () => {
const patched = patchPiSubagentsSource(scenario.file, scenario.input);
assert.match(patched, /function resolvePiAgentDir\(\): string \{/);
assert.match(patched, /process\.env\.PI_CODING_AGENT_DIR\?\.trim\(\)/);
assert.ok(patched.includes(scenario.expected));
assert.ok(!patched.includes(scenario.original));
});
}
test("patchPiSubagentsSource is idempotent", () => {
const input = [
'import * as os from "node:os";',
'import * as path from "node:path";',
'const configPath = path.join(os.homedir(), ".pi", "agent", "extensions", "subagent", "config.json");',
"",
].join("\n");
const once = patchPiSubagentsSource("index.ts", input);
const twice = patchPiSubagentsSource("index.ts", once);
assert.equal(twice, once);
});
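The cases above all assert that the patcher injects a `resolvePiAgentDir()` helper that consults `PI_CODING_AGENT_DIR`. A minimal sketch of what such a helper could look like, under the assumption that the fallback mirrors the legacy `os.homedir()` literals being replaced (the real injected code may differ):

```typescript
// Hypothetical sketch, not the actual patcher output: the tests above only
// pin down the helper's signature and its PI_CODING_AGENT_DIR check, so the
// fallback here simply mirrors the legacy literals the patch rewrites away from.
import * as os from "node:os";
import * as path from "node:path";

function resolvePiAgentDir(): string {
  // An explicit override wins; trim() guards against whitespace-only values.
  const override = process.env.PI_CODING_AGENT_DIR?.trim();
  if (override) {
    return override;
  }
  // Otherwise fall back to the historical default location.
  return path.join(os.homedir(), ".pi", "agent");
}

console.log(resolvePiAgentDir());
```

Because every rewritten call site joins onto this single helper, the override applies uniformly to sessions, run history, skills, and chain saves.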

View File

@@ -0,0 +1,48 @@
import test from "node:test";
import assert from "node:assert/strict";
import { patchPiWebAccessSource } from "../scripts/lib/pi-web-access-patch.mjs";
test("patchPiWebAccessSource rewrites legacy Pi web-search config paths", () => {
const input = [
'import { join } from "node:path";',
'import { homedir } from "node:os";',
'const CONFIG_PATH = join(homedir(), ".pi", "web-search.json");',
"",
].join("\n");
const patched = patchPiWebAccessSource("perplexity.ts", input);
assert.match(patched, /FEYNMAN_WEB_SEARCH_CONFIG/);
assert.match(patched, /PI_WEB_SEARCH_CONFIG/);
});
test("patchPiWebAccessSource updates index.ts directory handling", () => {
const input = [
'import { existsSync, mkdirSync } from "node:fs";',
'import { join } from "node:path";',
'import { homedir } from "node:os";',
'const WEB_SEARCH_CONFIG_PATH = join(homedir(), ".pi", "web-search.json");',
'const dir = join(homedir(), ".pi");',
"",
].join("\n");
const patched = patchPiWebAccessSource("index.ts", input);
assert.match(patched, /import \{ dirname, join \} from "node:path";/);
assert.match(patched, /const dir = dirname\(WEB_SEARCH_CONFIG_PATH\);/);
});
test("patchPiWebAccessSource is idempotent", () => {
const input = [
'import { join } from "node:path";',
'import { homedir } from "node:os";',
'const CONFIG_PATH = join(homedir(), ".pi", "web-search.json");',
"",
].join("\n");
const once = patchPiWebAccessSource("perplexity.ts", input);
const twice = patchPiWebAccessSource("perplexity.ts", once);
assert.equal(twice, once);
});

View File

@@ -9,6 +9,7 @@ import {
getPiWebAccessStatus,
getPiWebSearchConfigPath,
loadPiWebAccessConfig,
savePiWebAccessConfig,
} from "../src/pi/web-access.js";
test("loadPiWebAccessConfig returns empty config when Pi web config is missing", () => {
@@ -18,7 +19,56 @@ test("loadPiWebAccessConfig returns empty config when Pi web config is missing",
assert.deepEqual(loadPiWebAccessConfig(configPath), {});
});
test("getPiWebSearchConfigPath respects FEYNMAN_HOME semantics", () => {
assert.equal(getPiWebSearchConfigPath("/tmp/custom-home"), "/tmp/custom-home/.feynman/web-search.json");
});
test("savePiWebAccessConfig merges updates and deletes undefined values", () => {
const root = mkdtempSync(join(tmpdir(), "feynman-pi-web-"));
const configPath = getPiWebSearchConfigPath(root);
savePiWebAccessConfig({
provider: "perplexity",
searchProvider: "perplexity",
perplexityApiKey: "pplx_...",
}, configPath);
savePiWebAccessConfig({
provider: undefined,
searchProvider: undefined,
route: undefined,
}, configPath);
assert.deepEqual(loadPiWebAccessConfig(configPath), {
perplexityApiKey: "pplx_...",
});
});
test("getPiWebAccessStatus reads Pi web-access config directly", () => {
const root = mkdtempSync(join(tmpdir(), "feynman-pi-web-"));
const configPath = getPiWebSearchConfigPath(root);
mkdirSync(join(root, ".feynman"), { recursive: true });
writeFileSync(
configPath,
JSON.stringify({
provider: "exa",
searchProvider: "exa",
exaApiKey: "exa_...",
chromeProfile: "Profile 2",
geminiApiKey: "AIza...",
}),
"utf8",
);
const status = getPiWebAccessStatus(loadPiWebAccessConfig(configPath), configPath);
assert.equal(status.routeLabel, "Exa");
assert.equal(status.requestProvider, "exa");
assert.equal(status.exaConfigured, true);
assert.equal(status.geminiApiConfigured, true);
assert.equal(status.perplexityConfigured, false);
assert.equal(status.chromeProfile, "Profile 2");
});
test("getPiWebAccessStatus reads Gemini routes directly", () => {
const root = mkdtempSync(join(tmpdir(), "feynman-pi-web-"));
const configPath = getPiWebSearchConfigPath(root);
mkdirSync(join(root, ".feynman"), { recursive: true });
@@ -36,11 +86,23 @@ test("getPiWebAccessStatus reads Pi web-access config directly", () => {
const status = getPiWebAccessStatus(loadPiWebAccessConfig(configPath), configPath);
assert.equal(status.routeLabel, "Gemini");
assert.equal(status.requestProvider, "gemini");
assert.equal(status.exaConfigured, false);
assert.equal(status.geminiApiConfigured, true);
assert.equal(status.perplexityConfigured, false);
assert.equal(status.chromeProfile, "Profile 2");
});
test("getPiWebAccessStatus supports the legacy route key", () => {
const status = getPiWebAccessStatus({
route: "perplexity",
perplexityApiKey: "pplx_...",
});
assert.equal(status.routeLabel, "Perplexity");
assert.equal(status.requestProvider, "perplexity");
assert.equal(status.perplexityConfigured, true);
});
test("formatPiWebAccessDoctorLines reports Pi-managed web access", () => {
const lines = formatPiWebAccessDoctorLines(
getPiWebAccessStatus({

View File

@@ -0,0 +1,41 @@
import test from "node:test";
import assert from "node:assert/strict";
import { mkdtempSync, readFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";
import {
getConfiguredServiceTier,
normalizeServiceTier,
resolveProviderServiceTier,
setConfiguredServiceTier,
} from "../src/model/service-tier.js";
test("normalizeServiceTier accepts supported values only", () => {
assert.equal(normalizeServiceTier("priority"), "priority");
assert.equal(normalizeServiceTier("standard_only"), "standard_only");
assert.equal(normalizeServiceTier("FAST"), undefined);
assert.equal(normalizeServiceTier(undefined), undefined);
});
test("setConfiguredServiceTier persists and clears settings.json values", () => {
const dir = mkdtempSync(join(tmpdir(), "feynman-service-tier-"));
const settingsPath = join(dir, "settings.json");
setConfiguredServiceTier(settingsPath, "priority");
assert.equal(getConfiguredServiceTier(settingsPath), "priority");
const persisted = JSON.parse(readFileSync(settingsPath, "utf8")) as { serviceTier?: string };
assert.equal(persisted.serviceTier, "priority");
setConfiguredServiceTier(settingsPath, undefined);
assert.equal(getConfiguredServiceTier(settingsPath), undefined);
});
test("resolveProviderServiceTier filters unsupported provider+tier pairs", () => {
assert.equal(resolveProviderServiceTier("openai", "priority"), "priority");
assert.equal(resolveProviderServiceTier("openai-codex", "flex"), "flex");
assert.equal(resolveProviderServiceTier("anthropic", "standard_only"), "standard_only");
assert.equal(resolveProviderServiceTier("anthropic", "priority"), undefined);
assert.equal(resolveProviderServiceTier("google", "priority"), undefined);
});

tests/skill-paths.test.ts (new file, 28 lines)
View File

@@ -0,0 +1,28 @@
import test from "node:test";
import assert from "node:assert/strict";
import { existsSync, readdirSync, readFileSync } from "node:fs";
import { dirname, join, resolve } from "node:path";
import { fileURLToPath } from "node:url";
const repoRoot = resolve(dirname(fileURLToPath(import.meta.url)), "..");
const skillsRoot = join(repoRoot, "skills");
const markdownPathPattern = /`((?:\.\.?\/)(?:[A-Za-z0-9._-]+\/)*[A-Za-z0-9._-]+\.md)`/g;
const simulatedInstallRoot = join(repoRoot, "__skill-install-root__");
test("all local markdown references in bundled skills resolve in the installed skill layout", () => {
for (const entry of readdirSync(skillsRoot, { withFileTypes: true })) {
if (!entry.isDirectory()) continue;
const skillPath = join(skillsRoot, entry.name, "SKILL.md");
if (!existsSync(skillPath)) continue;
const content = readFileSync(skillPath, "utf8");
for (const match of content.matchAll(markdownPathPattern)) {
const reference = match[1];
const installedSkillDir = join(simulatedInstallRoot, entry.name);
const installedTarget = resolve(installedSkillDir, reference);
const repoTarget = installedTarget.replace(simulatedInstallRoot, repoRoot);
assert.ok(existsSync(repoTarget), `${skillPath} references missing installed markdown file ${reference}`);
}
}
});

File diff suppressed because one or more lines are too long

View File

@@ -26,6 +26,7 @@
"tw-animate-css": "^1.4.0"
},
"devDependencies": {
"@astrojs/check": "^0.9.8",
"@eslint/js": "^9.39.4",
"eslint": "^9.39.4",
"eslint-plugin-react-hooks": "^7.0.1",
@@ -36,6 +37,68 @@
"prettier-plugin-tailwindcss": "^0.7.2",
"typescript": "~5.9.3",
"typescript-eslint": "^8.57.1"
},
"engines": {
"node": ">=20.19.0"
}
},
"node_modules/@astrojs/check": {
"version": "0.9.8",
"resolved": "https://registry.npmjs.org/@astrojs/check/-/check-0.9.8.tgz",
"integrity": "sha512-LDng8446QLS5ToKjRHd3bgUdirvemVVExV7nRyJfW2wV36xuv7vDxwy5NWN9zqeSEDgg0Tv84sP+T3yEq+Zlkw==",
"dev": true,
"license": "MIT",
"dependencies": {
"@astrojs/language-server": "^2.16.5",
"chokidar": "^4.0.3",
"kleur": "^4.1.5",
"yargs": "^17.7.2"
},
"bin": {
"astro-check": "bin/astro-check.js"
},
"peerDependencies": {
"typescript": "^5.0.0"
}
},
"node_modules/@astrojs/check/node_modules/chokidar": {
"version": "4.0.3",
"resolved": "https://registry.npmjs.org/chokidar/-/chokidar-4.0.3.tgz",
"integrity": "sha512-Qgzu8kfBvo+cA4962jnP1KkS6Dop5NS6g7R5LFYJr4b8Ub94PPQXUksCw9PvXoeXPRRddRNC5C1JQUR2SMGtnA==",
"dev": true,
"license": "MIT",
"dependencies": {
"readdirp": "^4.0.1"
},
"engines": {
"node": ">= 14.16.0"
},
"funding": {
"url": "https://paulmillr.com/funding/"
}
},
"node_modules/@astrojs/check/node_modules/kleur": {
"version": "4.1.5",
"resolved": "https://registry.npmjs.org/kleur/-/kleur-4.1.5.tgz",
"integrity": "sha512-o+NO+8WrRiQEE4/7nwRJhN1HWpVmJm511pBHUxPLtp0BUISzlBplORYSmTclCnJvQq2tKu/sgl3xVpkc7ZWuQQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=6"
}
},
"node_modules/@astrojs/check/node_modules/readdirp": {
"version": "4.1.2",
"resolved": "https://registry.npmjs.org/readdirp/-/readdirp-4.1.2.tgz",
"integrity": "sha512-GDhwkLfywWL2s6vEjyhri+eXmfH6j1L7JE27WhqLeYzoh/A3DBaYGEj2H/HFZCn/kMfim73FXxEJTw06WtxQwg==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">= 14.18.0"
},
"funding": {
"type": "individual",
"url": "https://paulmillr.com/funding/"
}
},
"node_modules/@astrojs/compiler": {
@@ -50,6 +113,48 @@
"integrity": "sha512-GOle7smBWKfMSP8osUIGOlB5kaHdQLV3foCsf+5Q9Wsuu+C6Fs3Ez/ttXmhjZ1HkSgsogcM1RXSjjOVieHq16Q==",
"license": "MIT"
},
"node_modules/@astrojs/language-server": {
"version": "2.16.6",
"resolved": "https://registry.npmjs.org/@astrojs/language-server/-/language-server-2.16.6.tgz",
"integrity": "sha512-N990lu+HSFiG57owR0XBkr02BYMgiLCshLf+4QG4v6jjSWkBeQGnzqi+E1L08xFPPJ7eEeXnxPXGLaVv5pa4Ug==",
"dev": true,
"license": "MIT",
"dependencies": {
"@astrojs/compiler": "^2.13.1",
"@astrojs/yaml2ts": "^0.2.3",
"@jridgewell/sourcemap-codec": "^1.5.5",
"@volar/kit": "~2.4.28",
"@volar/language-core": "~2.4.28",
"@volar/language-server": "~2.4.28",
"@volar/language-service": "~2.4.28",
"muggle-string": "^0.4.1",
"tinyglobby": "^0.2.15",
"volar-service-css": "0.0.70",
"volar-service-emmet": "0.0.70",
"volar-service-html": "0.0.70",
"volar-service-prettier": "0.0.70",
"volar-service-typescript": "0.0.70",
"volar-service-typescript-twoslash-queries": "0.0.70",
"volar-service-yaml": "0.0.70",
"vscode-html-languageservice": "^5.6.2",
"vscode-uri": "^3.1.0"
},
"bin": {
"astro-ls": "bin/nodeServer.js"
},
"peerDependencies": {
"prettier": "^3.0.0",
"prettier-plugin-astro": ">=0.11.0"
},
"peerDependenciesMeta": {
"prettier": {
"optional": true
},
"prettier-plugin-astro": {
"optional": true
}
}
},
"node_modules/@astrojs/markdown-remark": {
"version": "6.3.11",
"resolved": "https://registry.npmjs.org/@astrojs/markdown-remark/-/markdown-remark-6.3.11.tgz",
@@ -129,6 +234,16 @@
"node": "18.20.8 || ^20.3.0 || >=22.0.0"
}
},
"node_modules/@astrojs/yaml2ts": {
"version": "0.2.3",
"resolved": "https://registry.npmjs.org/@astrojs/yaml2ts/-/yaml2ts-0.2.3.tgz",
"integrity": "sha512-PJzRmgQzUxI2uwpdX2lXSHtP4G8ocp24/t+bZyf5Fy0SZLSF9f9KXZoMlFM/XCGue+B0nH/2IZ7FpBYQATBsCg==",
"dev": true,
"license": "MIT",
"dependencies": {
"yaml": "^2.8.2"
}
},
"node_modules/@babel/code-frame": {
"version": "7.29.0",
"resolved": "https://registry.npmjs.org/@babel/code-frame/-/code-frame-7.29.0.tgz",
@@ -735,6 +850,68 @@
"@noble/ciphers": "^1.0.0"
}
},
"node_modules/@emmetio/abbreviation": {
"version": "2.3.3",
"resolved": "https://registry.npmjs.org/@emmetio/abbreviation/-/abbreviation-2.3.3.tgz",
"integrity": "sha512-mgv58UrU3rh4YgbE/TzgLQwJ3pFsHHhCLqY20aJq+9comytTXUDNGG/SMtSeMJdkpxgXSXunBGLD8Boka3JyVA==",
"dev": true,
"license": "MIT",
"dependencies": {
"@emmetio/scanner": "^1.0.4"
}
},
"node_modules/@emmetio/css-abbreviation": {
"version": "2.1.8",
"resolved": "https://registry.npmjs.org/@emmetio/css-abbreviation/-/css-abbreviation-2.1.8.tgz",
"integrity": "sha512-s9yjhJ6saOO/uk1V74eifykk2CBYi01STTK3WlXWGOepyKa23ymJ053+DNQjpFcy1ingpaO7AxCcwLvHFY9tuw==",
"dev": true,
"license": "MIT",
"dependencies": {
"@emmetio/scanner": "^1.0.4"
}
},
"node_modules/@emmetio/css-parser": {
"version": "0.4.1",
"resolved": "https://registry.npmjs.org/@emmetio/css-parser/-/css-parser-0.4.1.tgz",
"integrity": "sha512-2bC6m0MV/voF4CTZiAbG5MWKbq5EBmDPKu9Sb7s7nVcEzNQlrZP6mFFFlIaISM8X6514H9shWMme1fCm8cWAfQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"@emmetio/stream-reader": "^2.2.0",
"@emmetio/stream-reader-utils": "^0.1.0"
}
},
"node_modules/@emmetio/html-matcher": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/@emmetio/html-matcher/-/html-matcher-1.3.0.tgz",
"integrity": "sha512-NTbsvppE5eVyBMuyGfVu2CRrLvo7J4YHb6t9sBFLyY03WYhXET37qA4zOYUjBWFCRHO7pS1B9khERtY0f5JXPQ==",
"dev": true,
"license": "ISC",
"dependencies": {
"@emmetio/scanner": "^1.0.0"
}
},
"node_modules/@emmetio/scanner": {
"version": "1.0.4",
"resolved": "https://registry.npmjs.org/@emmetio/scanner/-/scanner-1.0.4.tgz",
"integrity": "sha512-IqRuJtQff7YHHBk4G8YZ45uB9BaAGcwQeVzgj/zj8/UdOhtQpEIupUhSk8dys6spFIWVZVeK20CzGEnqR5SbqA==",
"dev": true,
"license": "MIT"
},
"node_modules/@emmetio/stream-reader": {
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/@emmetio/stream-reader/-/stream-reader-2.2.0.tgz",
"integrity": "sha512-fXVXEyFA5Yv3M3n8sUGT7+fvecGrZP4k6FnWWMSZVQf69kAq0LLpaBQLGcPR30m3zMmKYhECP4k/ZkzvhEW5kw==",
"dev": true,
"license": "MIT"
},
"node_modules/@emmetio/stream-reader-utils": {
"version": "0.1.0",
"resolved": "https://registry.npmjs.org/@emmetio/stream-reader-utils/-/stream-reader-utils-0.1.0.tgz",
"integrity": "sha512-ZsZ2I9Vzso3Ho/pjZFsmmZ++FWeEd/txqybHTm4OgaZzdS8V9V/YYWQwg5TC38Z7uLWUV1vavpLLbjJtKubR1A==",
"dev": true,
"license": "MIT"
},
"node_modules/@emnapi/runtime": {
"version": "1.9.1",
"resolved": "https://registry.npmjs.org/@emnapi/runtime/-/runtime-1.9.1.tgz",
@@ -1366,9 +1543,9 @@
}
},
"node_modules/@hono/node-server": {
"version": "1.19.11",
"version": "1.19.13",
"resolved": "https://registry.npmjs.org/@hono/node-server/-/node-server-1.19.11.tgz",
"resolved": "https://registry.npmjs.org/@hono/node-server/-/node-server-1.19.13.tgz",
"integrity": "sha512-dr8/3zEaB+p0D2n/IUrlPF1HZm586qgJNXK1a9fhg/PzdtkK7Ksd5l312tJX2yBuALqDYBlG20QEbayqPyxn+g==",
"integrity": "sha512-TsQLe4i2gvoTtrHje625ngThGBySOgSK3Xo2XRYOdqGN1teR8+I7vchQC46uLJi8OF62YTYA3AhSpumtkhsaKQ==",
"license": "MIT",
"engines": {
"node": ">=18.14.1"
@@ -4484,27 +4661,6 @@
"path-browserify": "^1.0.1"
}
},
"node_modules/@ts-morph/common/node_modules/balanced-match": {
"version": "4.0.4",
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-4.0.4.tgz",
"integrity": "sha512-BLrgEcRTwX2o6gGxGOCNyMvGSp35YofuYzw9h1IMTRmKqttAZZVU67bdb9Pr2vUHA8+j3i2tJfjO6C6+4myGTA==",
"license": "MIT",
"engines": {
"node": "18 || 20 || >=22"
}
},
"node_modules/@ts-morph/common/node_modules/brace-expansion": {
"version": "5.0.5",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-5.0.5.tgz",
"integrity": "sha512-VZznLgtwhn+Mact9tfiwx64fA9erHH/MCXEUfB/0bX/6Fz6ny5EGTXYltMocqg4xFAQZtnO3DHWWXi8RiuN7cQ==",
"license": "MIT",
"dependencies": {
"balanced-match": "^4.0.2"
},
"engines": {
"node": "18 || 20 || >=22"
}
},
"node_modules/@ts-morph/common/node_modules/minimatch": {
"version": "10.2.4",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-10.2.4.tgz",
@@ -4520,6 +4676,22 @@
"url": "https://github.com/sponsors/isaacs"
}
},
"node_modules/@ts-morph/common/node_modules/minimatch/node_modules/balanced-match": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",
"integrity": "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw==",
"license": "MIT"
},
"node_modules/@ts-morph/common/node_modules/minimatch/node_modules/brace-expansion": {
"version": "1.1.13",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.13.tgz",
"integrity": "sha512-9ZLprWS6EENmhEOpjCYW2c8VkmOvckIJZfkr7rBW6dObmfgJ/L1GpSYW5Hpo9lDz4D1+n0Ckz8rU7FwHDQiG/w==",
"license": "MIT",
"dependencies": {
"balanced-match": "^1.0.0",
"concat-map": "0.0.1"
}
},
"node_modules/@types/babel__core": {
"version": "7.20.5",
"resolved": "https://registry.npmjs.org/@types/babel__core/-/babel__core-7.20.5.tgz",
@@ -4840,29 +5012,6 @@
"typescript": ">=4.8.4 <6.0.0"
}
},
"node_modules/@typescript-eslint/typescript-estree/node_modules/balanced-match": {
"version": "4.0.4",
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-4.0.4.tgz",
"integrity": "sha512-BLrgEcRTwX2o6gGxGOCNyMvGSp35YofuYzw9h1IMTRmKqttAZZVU67bdb9Pr2vUHA8+j3i2tJfjO6C6+4myGTA==",
"dev": true,
"license": "MIT",
"engines": {
"node": "18 || 20 || >=22"
}
},
"node_modules/@typescript-eslint/typescript-estree/node_modules/brace-expansion": {
"version": "5.0.5",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-5.0.5.tgz",
"integrity": "sha512-VZznLgtwhn+Mact9tfiwx64fA9erHH/MCXEUfB/0bX/6Fz6ny5EGTXYltMocqg4xFAQZtnO3DHWWXi8RiuN7cQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"balanced-match": "^4.0.2"
},
"engines": {
"node": "18 || 20 || >=22"
}
},
"node_modules/@typescript-eslint/typescript-estree/node_modules/minimatch": {
"version": "10.2.4",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-10.2.4.tgz",
@@ -4879,6 +5028,24 @@
"url": "https://github.com/sponsors/isaacs"
}
},
"node_modules/@typescript-eslint/typescript-estree/node_modules/minimatch/node_modules/balanced-match": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-1.0.2.tgz",
"integrity": "sha512-3oSeUO0TMV67hN1AmbXsK4yaqU7tjiHlbxRDZOpH0KW9+CeX4bRAaX0Anxt0tx2MrpRpWwQaPwIlISEJhYU5Pw==",
"dev": true,
"license": "MIT"
},
"node_modules/@typescript-eslint/typescript-estree/node_modules/minimatch/node_modules/brace-expansion": {
"version": "1.1.13",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.13.tgz",
"integrity": "sha512-9ZLprWS6EENmhEOpjCYW2c8VkmOvckIJZfkr7rBW6dObmfgJ/L1GpSYW5Hpo9lDz4D1+n0Ckz8rU7FwHDQiG/w==",
"dev": true,
"license": "MIT",
"dependencies": {
"balanced-match": "^1.0.0",
"concat-map": "0.0.1"
}
},
"node_modules/@typescript-eslint/typescript-estree/node_modules/semver": {
"version": "7.7.4",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.7.4.tgz",
@@ -4973,6 +5140,104 @@
"vite": "^4.2.0 || ^5.0.0 || ^6.0.0 || ^7.0.0"
}
},
"node_modules/@volar/kit": {
"version": "2.4.28",
"resolved": "https://registry.npmjs.org/@volar/kit/-/kit-2.4.28.tgz",
"integrity": "sha512-cKX4vK9dtZvDRaAzeoUdaAJEew6IdxHNCRrdp5Kvcl6zZOqb6jTOfk3kXkIkG3T7oTFXguEMt5+9ptyqYR84Pg==",
"dev": true,
"license": "MIT",
"dependencies": {
"@volar/language-service": "2.4.28",
"@volar/typescript": "2.4.28",
"typesafe-path": "^0.2.2",
"vscode-languageserver-textdocument": "^1.0.11",
"vscode-uri": "^3.0.8"
},
"peerDependencies": {
"typescript": "*"
}
},
"node_modules/@volar/language-core": {
"version": "2.4.28",
"resolved": "https://registry.npmjs.org/@volar/language-core/-/language-core-2.4.28.tgz",
"integrity": "sha512-w4qhIJ8ZSitgLAkVay6AbcnC7gP3glYM3fYwKV3srj8m494E3xtrCv6E+bWviiK/8hs6e6t1ij1s2Endql7vzQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"@volar/source-map": "2.4.28"
}
},
"node_modules/@volar/language-server": {
"version": "2.4.28",
"resolved": "https://registry.npmjs.org/@volar/language-server/-/language-server-2.4.28.tgz",
"integrity": "sha512-NqcLnE5gERKuS4PUFwlhMxf6vqYo7hXtbMFbViXcbVkbZ905AIVWhnSo0ZNBC2V127H1/2zP7RvVOVnyITFfBw==",
"dev": true,
"license": "MIT",
"dependencies": {
"@volar/language-core": "2.4.28",
"@volar/language-service": "2.4.28",
"@volar/typescript": "2.4.28",
"path-browserify": "^1.0.1",
"request-light": "^0.7.0",
"vscode-languageserver": "^9.0.1",
"vscode-languageserver-protocol": "^3.17.5",
"vscode-languageserver-textdocument": "^1.0.11",
"vscode-uri": "^3.0.8"
}
},
"node_modules/@volar/language-service": {
"version": "2.4.28",
"resolved": "https://registry.npmjs.org/@volar/language-service/-/language-service-2.4.28.tgz",
"integrity": "sha512-Rh/wYCZJrI5vCwMk9xyw/Z+MsWxlJY1rmMZPsxUoJKfzIRjS/NF1NmnuEcrMbEVGja00aVpCsInJfixQTMdvLw==",
"dev": true,
"license": "MIT",
"dependencies": {
"@volar/language-core": "2.4.28",
"vscode-languageserver-protocol": "^3.17.5",
"vscode-languageserver-textdocument": "^1.0.11",
"vscode-uri": "^3.0.8"
}
},
"node_modules/@volar/source-map": {
"version": "2.4.28",
"resolved": "https://registry.npmjs.org/@volar/source-map/-/source-map-2.4.28.tgz",
"integrity": "sha512-yX2BDBqJkRXfKw8my8VarTyjv48QwxdJtvRgUpNE5erCsgEUdI2DsLbpa+rOQVAJYshY99szEcRDmyHbF10ggQ==",
"dev": true,
"license": "MIT"
},
"node_modules/@volar/typescript": {
"version": "2.4.28",
"resolved": "https://registry.npmjs.org/@volar/typescript/-/typescript-2.4.28.tgz",
"integrity": "sha512-Ja6yvWrbis2QtN4ClAKreeUZPVYMARDYZl9LMEv1iQ1QdepB6wn0jTRxA9MftYmYa4DQ4k/DaSZpFPUfxl8giw==",
"dev": true,
"license": "MIT",
"dependencies": {
"@volar/language-core": "2.4.28",
"path-browserify": "^1.0.1",
"vscode-uri": "^3.0.8"
}
},
"node_modules/@vscode/emmet-helper": {
"version": "2.11.0",
"resolved": "https://registry.npmjs.org/@vscode/emmet-helper/-/emmet-helper-2.11.0.tgz",
"integrity": "sha512-QLxjQR3imPZPQltfbWRnHU6JecWTF1QSWhx3GAKQpslx7y3Dp6sIIXhKjiUJ/BR9FX8PVthjr9PD6pNwOJfAzw==",
"dev": true,
"license": "MIT",
"dependencies": {
"emmet": "^2.4.3",
"jsonc-parser": "^2.3.0",
"vscode-languageserver-textdocument": "^1.0.1",
"vscode-languageserver-types": "^3.15.1",
"vscode-uri": "^3.0.8"
}
},
"node_modules/@vscode/l10n": {
"version": "0.0.18",
"resolved": "https://registry.npmjs.org/@vscode/l10n/-/l10n-0.0.18.tgz",
"integrity": "sha512-KYSIHVmslkaCDyw013pphY+d7x1qV8IZupYfeIfzNA+nsaWHbn5uPuQRvdRFsa9zFzGeudPuoGoZ1Op4jrJXIQ==",
"dev": true,
"license": "MIT"
},
"node_modules/accepts": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/accepts/-/accepts-2.0.0.tgz",
@@ -5416,9 +5681,9 @@
}
},
"node_modules/brace-expansion": {
"version": "1.1.12",
"version": "1.1.13",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.12.tgz",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.13.tgz",
"integrity": "sha512-9T9UjW3r0UW5c1Q7GTwllptXwhvYmEzFhzMfZ9H7FQWt+uZePjZPjBP/W1ZEyZ1twGWom5/56TF4lPcqjnDHcg==",
"integrity": "sha512-9ZLprWS6EENmhEOpjCYW2c8VkmOvckIJZfkr7rBW6dObmfgJ/L1GpSYW5Hpo9lDz4D1+n0Ckz8rU7FwHDQiG/w==",
"dev": true,
"license": "MIT",
"dependencies": {
@@ -5856,7 +6121,6 @@
"version": "0.0.1",
"resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz",
"integrity": "sha512-/Srv4dswyQNBfohGpz9o6Yb3Gz3SrUDqBH5rTuhGR7ahtlbYKnVxw2bCFMRljaA7EXHaXZ8wsHdodFvbkhKmqg==",
"dev": true,
"license": "MIT"
},
"node_modules/content-disposition": {
@@ -6183,9 +6447,9 @@
}
},
"node_modules/defu": {
"version": "6.1.4",
"version": "6.1.7",
"resolved": "https://registry.npmjs.org/defu/-/defu-6.1.4.tgz",
"resolved": "https://registry.npmjs.org/defu/-/defu-6.1.7.tgz",
"integrity": "sha512-mEQCMmwJu317oSz8CwdIOdwf3xMif1ttiM8LTufzc3g6kR+9Pe236twL8j3IYT1F7GfRgGcW6MWxzZjLIkuHIg==",
"integrity": "sha512-7z22QmUWiQ/2d0KkdYmANbRUVABpZ9SNYyH5vx6PZ+nE5bcC0l7uFvEfHlyld/HcGBFTL536ClDt3DEcSlEJAQ==",
"license": "MIT"
},
"node_modules/depd": {
@@ -6404,6 +6668,23 @@
"integrity": "sha512-vFU34OcrvMcH66T+dYC3G4nURmgfDVewMIu6Q2urXpumAPSMmzvcn04KVVV8Opikq8Vs5nUbO/8laNhNRqSzYw==",
"license": "ISC"
},
"node_modules/emmet": {
"version": "2.4.11",
"resolved": "https://registry.npmjs.org/emmet/-/emmet-2.4.11.tgz",
"integrity": "sha512-23QPJB3moh/U9sT4rQzGgeyyGIrcM+GH5uVYg2C6wZIxAIJq7Ng3QLT79tl8FUwDXhyq9SusfknOrofAKqvgyQ==",
"dev": true,
"license": "MIT",
"workspaces": [
"./packages/scanner",
"./packages/abbreviation",
"./packages/css-abbreviation",
"./"
],
"dependencies": {
"@emmetio/abbreviation": "^2.3.3",
"@emmetio/css-abbreviation": "^2.1.8"
}
},
"node_modules/emoji-regex": {
"version": "10.6.0",
"resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-10.6.0.tgz",
@@ -7674,9 +7955,9 @@
}
},
"node_modules/hono": {
"version": "4.12.9",
"version": "4.12.12",
"resolved": "https://registry.npmjs.org/hono/-/hono-4.12.9.tgz",
"resolved": "https://registry.npmjs.org/hono/-/hono-4.12.12.tgz",
"integrity": "sha512-wy3T8Zm2bsEvxKZM5w21VdHDDcwVS1yUFFY6i8UobSsKfFceT7TOwhbhfKsDyx7tYQlmRM5FLpIuYvNFyjctiA==",
"integrity": "sha512-p1JfQMKaceuCbpJKAPKVqyqviZdS0eUxH9v82oWo1kb9xjQ5wA6iP3FNVAPDFlz5/p7d45lO+BpSk1tuSZMF4Q==",
"license": "MIT",
"engines": {
"node": ">=16.9.0"
@@ -8128,6 +8409,13 @@
"node": ">=6"
}
},
"node_modules/jsonc-parser": {
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/jsonc-parser/-/jsonc-parser-2.3.1.tgz",
"integrity": "sha512-H8jvkz1O50L3dMZCsLqiuB2tA7muqbSg1AtGEkN0leAqGjsUzDJir3Zwr02BhqdcITPg3ei3mZ+HjMocAknhhg==",
"dev": true,
"license": "MIT"
},
"node_modules/jsonfile": {
"version": "6.2.0",
"resolved": "https://registry.npmjs.org/jsonfile/-/jsonfile-6.2.0.tgz",
@@ -9555,6 +9843,13 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/muggle-string": {
"version": "0.4.1",
"resolved": "https://registry.npmjs.org/muggle-string/-/muggle-string-0.4.1.tgz",
"integrity": "sha512-VNTrAak/KhO2i8dqqnqnAHOa3cYBwXEZe9h+D5h/1ZqFSTEFHdM65lR7RoIqq3tBBYavsOXV84NoHXZ0AkPyqQ==",
"dev": true,
"license": "MIT"
},
"node_modules/mute-stream": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/mute-stream/-/mute-stream-2.0.0.tgz",
@@ -10810,6 +11105,13 @@
"url": "https://opencollective.com/unified"
}
},
"node_modules/request-light": {
"version": "0.7.0",
"resolved": "https://registry.npmjs.org/request-light/-/request-light-0.7.0.tgz",
"integrity": "sha512-lMbBMrDoxgsyO+yB3sDcrDuX85yYt7sS8BfQd11jtbW/z5ZWgLZRcEGLsLoYw7I0WSUGQBs8CC8ScIxkTX1+6Q==",
"dev": true,
"license": "MIT"
},
"node_modules/require-directory": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/require-directory/-/require-directory-2.1.1.tgz",
@@ -10991,9 +11293,9 @@
}
},
"node_modules/router/node_modules/path-to-regexp": {
"version": "8.3.0",
"version": "8.4.2",
"resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-8.3.0.tgz",
"resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-8.4.2.tgz",
"integrity": "sha512-7jdwVIRtsP8MYpdXSwOS0YdD0Du+qOoF/AEPIt88PcCFrZCzx41oxku1jD88hZBwbNUIEfpqvuhjFaMAqMTWnA==",
"integrity": "sha512-qRcuIdP69NPm4qbACK+aDogI5CBDMi1jKe0ry5rSQJz8JVLsC7jV8XpiJjGRLLol3N+R5ihGYcrPLTno6pAdBA==",
"license": "MIT",
"funding": {
"type": "opencollective",
@@ -11853,6 +12155,13 @@
"node": ">= 0.6"
}
},
"node_modules/typesafe-path": {
"version": "0.2.2",
"resolved": "https://registry.npmjs.org/typesafe-path/-/typesafe-path-0.2.2.tgz",
"integrity": "sha512-OJabfkAg1WLZSqJAJ0Z6Sdt3utnbzr/jh+NAHoyWHJe8CMSy79Gm085094M9nvTPy22KzTVn5Zq5mbapCI/hPA==",
"dev": true,
"license": "MIT"
},
"node_modules/typescript": {
"version": "5.9.3",
"resolved": "https://registry.npmjs.org/typescript/-/typescript-5.9.3.tgz",
@@ -11866,6 +12175,29 @@
"node": ">=14.17"
}
},
"node_modules/typescript-auto-import-cache": {
"version": "0.3.6",
"resolved": "https://registry.npmjs.org/typescript-auto-import-cache/-/typescript-auto-import-cache-0.3.6.tgz",
"integrity": "sha512-RpuHXrknHdVdK7wv/8ug3Fr0WNsNi5l5aB8MYYuXhq2UH5lnEB1htJ1smhtD5VeCsGr2p8mUDtd83LCQDFVgjQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"semver": "^7.3.8"
}
},
"node_modules/typescript-auto-import-cache/node_modules/semver": {
"version": "7.7.4",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.7.4.tgz",
"integrity": "sha512-vFKC2IEtQnVhpT78h1Yp8wzwrf8CM+MzKMHGJZfBtzhZNycRFnXsHk6E5TxIkkMsgNS7mdX3AGB7x2QM2di4lA==",
"dev": true,
"license": "ISC",
"bin": {
"semver": "bin/semver.js"
},
"engines": {
"node": ">=10"
}
},
"node_modules/typescript-eslint": { "node_modules/typescript-eslint": {
"version": "8.57.2", "version": "8.57.2",
"resolved": "https://registry.npmjs.org/typescript-eslint/-/typescript-eslint-8.57.2.tgz", "resolved": "https://registry.npmjs.org/typescript-eslint/-/typescript-eslint-8.57.2.tgz",
@@ -12364,9 +12696,9 @@
} }
}, },
"node_modules/vite": { "node_modules/vite": {
"version": "6.4.1", "version": "6.4.2",
"resolved": "https://registry.npmjs.org/vite/-/vite-6.4.1.tgz", "resolved": "https://registry.npmjs.org/vite/-/vite-6.4.2.tgz",
"integrity": "sha512-+Oxm7q9hDoLMyJOYfUYBuHQo+dkAloi33apOPP56pzj+vsdJDzr+j1NISE5pyaAuKL4A3UD34qd0lx5+kfKp2g==", "integrity": "sha512-2N/55r4JDJ4gdrCvGgINMy+HH3iRpNIz8K6SFwVsA+JbQScLiC+clmAxBgwiSPgcG9U15QmvqCGWzMbqda5zGQ==",
"license": "MIT", "license": "MIT",
"dependencies": { "dependencies": {
"esbuild": "^0.25.0", "esbuild": "^0.25.0",
@@ -12913,6 +13245,274 @@
} }
} }
}, },
"node_modules/volar-service-css": {
"version": "0.0.70",
"resolved": "https://registry.npmjs.org/volar-service-css/-/volar-service-css-0.0.70.tgz",
"integrity": "sha512-K1qyOvBpE3rzdAv3e4/6Rv5yizrYPy5R/ne3IWCAzLBuMO4qBMV3kSqWzj6KUVe6S0AnN6wxF7cRkiaKfYMYJw==",
"dev": true,
"license": "MIT",
"dependencies": {
"vscode-css-languageservice": "^6.3.0",
"vscode-languageserver-textdocument": "^1.0.11",
"vscode-uri": "^3.0.8"
},
"peerDependencies": {
"@volar/language-service": "~2.4.0"
},
"peerDependenciesMeta": {
"@volar/language-service": {
"optional": true
}
}
},
"node_modules/volar-service-emmet": {
"version": "0.0.70",
"resolved": "https://registry.npmjs.org/volar-service-emmet/-/volar-service-emmet-0.0.70.tgz",
"integrity": "sha512-xi5bC4m/VyE3zy/n2CXspKeDZs3qA41tHLTw275/7dNWM/RqE2z3BnDICQybHIVp/6G1iOQj5c1qXMgQC08TNg==",
"dev": true,
"license": "MIT",
"dependencies": {
"@emmetio/css-parser": "^0.4.1",
"@emmetio/html-matcher": "^1.3.0",
"@vscode/emmet-helper": "^2.9.3",
"vscode-uri": "^3.0.8"
},
"peerDependencies": {
"@volar/language-service": "~2.4.0"
},
"peerDependenciesMeta": {
"@volar/language-service": {
"optional": true
}
}
},
"node_modules/volar-service-html": {
"version": "0.0.70",
"resolved": "https://registry.npmjs.org/volar-service-html/-/volar-service-html-0.0.70.tgz",
"integrity": "sha512-eR6vCgMdmYAo4n+gcT7DSyBQbwB8S3HZZvSagTf0sxNaD4WppMCFfpqWnkrlGStPKMZvMiejRRVmqsX9dYcTvQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"vscode-html-languageservice": "^5.3.0",
"vscode-languageserver-textdocument": "^1.0.11",
"vscode-uri": "^3.0.8"
},
"peerDependencies": {
"@volar/language-service": "~2.4.0"
},
"peerDependenciesMeta": {
"@volar/language-service": {
"optional": true
}
}
},
"node_modules/volar-service-prettier": {
"version": "0.0.70",
"resolved": "https://registry.npmjs.org/volar-service-prettier/-/volar-service-prettier-0.0.70.tgz",
"integrity": "sha512-Z6BCFSpGVCd8BPAsZ785Kce1BGlWd5ODqmqZGVuB14MJvrR4+CYz6cDy4F+igmE1gMifqfvMhdgT8Aud4M5ngg==",
"dev": true,
"license": "MIT",
"dependencies": {
"vscode-uri": "^3.0.8"
},
"peerDependencies": {
"@volar/language-service": "~2.4.0",
"prettier": "^2.2 || ^3.0"
},
"peerDependenciesMeta": {
"@volar/language-service": {
"optional": true
},
"prettier": {
"optional": true
}
}
},
"node_modules/volar-service-typescript": {
"version": "0.0.70",
"resolved": "https://registry.npmjs.org/volar-service-typescript/-/volar-service-typescript-0.0.70.tgz",
"integrity": "sha512-l46Bx4cokkUedTd74ojO5H/zqHZJ8SUuyZ0IB8JN4jfRqUM3bQFBHoOwlZCyZmOeO0A3RQNkMnFclxO4c++gsg==",
"dev": true,
"license": "MIT",
"dependencies": {
"path-browserify": "^1.0.1",
"semver": "^7.6.2",
"typescript-auto-import-cache": "^0.3.5",
"vscode-languageserver-textdocument": "^1.0.11",
"vscode-nls": "^5.2.0",
"vscode-uri": "^3.0.8"
},
"peerDependencies": {
"@volar/language-service": "~2.4.0"
},
"peerDependenciesMeta": {
"@volar/language-service": {
"optional": true
}
}
},
"node_modules/volar-service-typescript-twoslash-queries": {
"version": "0.0.70",
"resolved": "https://registry.npmjs.org/volar-service-typescript-twoslash-queries/-/volar-service-typescript-twoslash-queries-0.0.70.tgz",
"integrity": "sha512-IdD13Z9N2Bu8EM6CM0fDV1E69olEYGHDU25X51YXmq8Y0CmJ2LNj6gOiBJgpS5JGUqFzECVhMNBW7R0sPdRTMQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"vscode-uri": "^3.0.8"
},
"peerDependencies": {
"@volar/language-service": "~2.4.0"
},
"peerDependenciesMeta": {
"@volar/language-service": {
"optional": true
}
}
},
"node_modules/volar-service-typescript/node_modules/semver": {
"version": "7.7.4",
"resolved": "https://registry.npmjs.org/semver/-/semver-7.7.4.tgz",
"integrity": "sha512-vFKC2IEtQnVhpT78h1Yp8wzwrf8CM+MzKMHGJZfBtzhZNycRFnXsHk6E5TxIkkMsgNS7mdX3AGB7x2QM2di4lA==",
"dev": true,
"license": "ISC",
"bin": {
"semver": "bin/semver.js"
},
"engines": {
"node": ">=10"
}
},
"node_modules/volar-service-yaml": {
"version": "0.0.70",
"resolved": "https://registry.npmjs.org/volar-service-yaml/-/volar-service-yaml-0.0.70.tgz",
"integrity": "sha512-0c8bXDBeoATF9F6iPIlOuYTuZAC4c+yi0siQo920u7eiBJk8oQmUmg9cDUbR4+Gl++bvGP4plj3fErbJuPqdcQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"vscode-uri": "^3.0.8",
"yaml-language-server": "~1.20.0"
},
"peerDependencies": {
"@volar/language-service": "~2.4.0"
},
"peerDependenciesMeta": {
"@volar/language-service": {
"optional": true
}
}
},
"node_modules/vscode-css-languageservice": {
"version": "6.3.10",
"resolved": "https://registry.npmjs.org/vscode-css-languageservice/-/vscode-css-languageservice-6.3.10.tgz",
"integrity": "sha512-eq5N9Er3fC4vA9zd9EFhyBG90wtCCuXgRSpAndaOgXMh1Wgep5lBgRIeDgjZBW9pa+332yC9+49cZMW8jcL3MA==",
"dev": true,
"license": "MIT",
"dependencies": {
"@vscode/l10n": "^0.0.18",
"vscode-languageserver-textdocument": "^1.0.12",
"vscode-languageserver-types": "3.17.5",
"vscode-uri": "^3.1.0"
}
},
"node_modules/vscode-html-languageservice": {
"version": "5.6.2",
"resolved": "https://registry.npmjs.org/vscode-html-languageservice/-/vscode-html-languageservice-5.6.2.tgz",
"integrity": "sha512-ulCrSnFnfQ16YzvwnYUgEbUEl/ZG7u2eV27YhvLObSHKkb8fw1Z9cgsnUwjTEeDIdJDoTDTDpxuhQwoenoLNMg==",
"dev": true,
"license": "MIT",
"dependencies": {
"@vscode/l10n": "^0.0.18",
"vscode-languageserver-textdocument": "^1.0.12",
"vscode-languageserver-types": "^3.17.5",
"vscode-uri": "^3.1.0"
}
},
"node_modules/vscode-json-languageservice": {
"version": "4.1.8",
"resolved": "https://registry.npmjs.org/vscode-json-languageservice/-/vscode-json-languageservice-4.1.8.tgz",
"integrity": "sha512-0vSpg6Xd9hfV+eZAaYN63xVVMOTmJ4GgHxXnkLCh+9RsQBkWKIghzLhW2B9ebfG+LQQg8uLtsQ2aUKjTgE+QOg==",
"dev": true,
"license": "MIT",
"dependencies": {
"jsonc-parser": "^3.0.0",
"vscode-languageserver-textdocument": "^1.0.1",
"vscode-languageserver-types": "^3.16.0",
"vscode-nls": "^5.0.0",
"vscode-uri": "^3.0.2"
},
"engines": {
"npm": ">=7.0.0"
}
},
"node_modules/vscode-json-languageservice/node_modules/jsonc-parser": {
"version": "3.3.1",
"resolved": "https://registry.npmjs.org/jsonc-parser/-/jsonc-parser-3.3.1.tgz",
"integrity": "sha512-HUgH65KyejrUFPvHFPbqOY0rsFip3Bo5wb4ngvdi1EpCYWUQDC5V+Y7mZws+DLkr4M//zQJoanu1SP+87Dv1oQ==",
"dev": true,
"license": "MIT"
},
"node_modules/vscode-jsonrpc": {
"version": "8.2.0",
"resolved": "https://registry.npmjs.org/vscode-jsonrpc/-/vscode-jsonrpc-8.2.0.tgz",
"integrity": "sha512-C+r0eKJUIfiDIfwJhria30+TYWPtuHJXHtI7J0YlOmKAo7ogxP20T0zxB7HZQIFhIyvoBPwWskjxrvAtfjyZfA==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=14.0.0"
}
},
"node_modules/vscode-languageserver": {
"version": "9.0.1",
"resolved": "https://registry.npmjs.org/vscode-languageserver/-/vscode-languageserver-9.0.1.tgz",
"integrity": "sha512-woByF3PDpkHFUreUa7Hos7+pUWdeWMXRd26+ZX2A8cFx6v/JPTtd4/uN0/jB6XQHYaOlHbio03NTHCqrgG5n7g==",
"dev": true,
"license": "MIT",
"dependencies": {
"vscode-languageserver-protocol": "3.17.5"
},
"bin": {
"installServerIntoExtension": "bin/installServerIntoExtension"
}
},
"node_modules/vscode-languageserver-protocol": {
"version": "3.17.5",
"resolved": "https://registry.npmjs.org/vscode-languageserver-protocol/-/vscode-languageserver-protocol-3.17.5.tgz",
"integrity": "sha512-mb1bvRJN8SVznADSGWM9u/b07H7Ecg0I3OgXDuLdn307rl/J3A9YD6/eYOssqhecL27hK1IPZAsaqh00i/Jljg==",
"dev": true,
"license": "MIT",
"dependencies": {
"vscode-jsonrpc": "8.2.0",
"vscode-languageserver-types": "3.17.5"
}
},
"node_modules/vscode-languageserver-textdocument": {
"version": "1.0.12",
"resolved": "https://registry.npmjs.org/vscode-languageserver-textdocument/-/vscode-languageserver-textdocument-1.0.12.tgz",
"integrity": "sha512-cxWNPesCnQCcMPeenjKKsOCKQZ/L6Tv19DTRIGuLWe32lyzWhihGVJ/rcckZXJxfdKCFvRLS3fpBIsV/ZGX4zA==",
"dev": true,
"license": "MIT"
},
"node_modules/vscode-languageserver-types": {
"version": "3.17.5",
"resolved": "https://registry.npmjs.org/vscode-languageserver-types/-/vscode-languageserver-types-3.17.5.tgz",
"integrity": "sha512-Ld1VelNuX9pdF39h2Hgaeb5hEZM2Z3jUrrMgWQAu82jMtZp7p3vJT3BzToKtZI7NgQssZje5o0zryOrhQvzQAg==",
"dev": true,
"license": "MIT"
},
"node_modules/vscode-nls": {
"version": "5.2.0",
"resolved": "https://registry.npmjs.org/vscode-nls/-/vscode-nls-5.2.0.tgz",
"integrity": "sha512-RAaHx7B14ZU04EU31pT+rKz2/zSl7xMsfIZuo8pd+KZO6PXtQmpevpq3vxvWNcrGbdmhM/rr5Uw5Mz+NBfhVng==",
"dev": true,
"license": "MIT"
},
"node_modules/vscode-uri": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/vscode-uri/-/vscode-uri-3.1.0.tgz",
"integrity": "sha512-/BpdSx+yCQGnCvecbyXdxHDkuk55/G3xwnC0GqY4gmQ3j+A+g8kzzgB4Nk/SINjqn6+waqw3EgbVF2QKExkRxQ==",
"dev": true,
"license": "MIT"
},
"node_modules/web-namespaces": { "node_modules/web-namespaces": {
"version": "2.0.1", "version": "2.0.1",
"resolved": "https://registry.npmjs.org/web-namespaces/-/web-namespaces-2.0.1.tgz", "resolved": "https://registry.npmjs.org/web-namespaces/-/web-namespaces-2.0.1.tgz",
@@ -13041,6 +13641,91 @@
"integrity": "sha512-a4UGQaWPH59mOXUYnAG2ewncQS4i4F43Tv3JoAM+s2VDAmS9NsK8GpDMLrCHPksFT7h3K6TOoUNn2pb7RoXx4g==", "integrity": "sha512-a4UGQaWPH59mOXUYnAG2ewncQS4i4F43Tv3JoAM+s2VDAmS9NsK8GpDMLrCHPksFT7h3K6TOoUNn2pb7RoXx4g==",
"license": "ISC" "license": "ISC"
}, },
"node_modules/yaml": {
"version": "2.8.3",
"resolved": "https://registry.npmjs.org/yaml/-/yaml-2.8.3.tgz",
"integrity": "sha512-AvbaCLOO2Otw/lW5bmh9d/WEdcDFdQp2Z2ZUH3pX9U2ihyUY0nvLv7J6TrWowklRGPYbB/IuIMfYgxaCPg5Bpg==",
"devOptional": true,
"license": "ISC",
"bin": {
"yaml": "bin.mjs"
},
"engines": {
"node": ">= 14.6"
},
"funding": {
"url": "https://github.com/sponsors/eemeli"
}
},
"node_modules/yaml-language-server": {
"version": "1.20.0",
"resolved": "https://registry.npmjs.org/yaml-language-server/-/yaml-language-server-1.20.0.tgz",
"integrity": "sha512-qhjK/bzSRZ6HtTvgeFvjNPJGWdZ0+x5NREV/9XZWFjIGezew2b4r5JPy66IfOhd5OA7KeFwk1JfmEbnTvev0cA==",
"dev": true,
"license": "MIT",
"dependencies": {
"@vscode/l10n": "^0.0.18",
"ajv": "^8.17.1",
"ajv-draft-04": "^1.0.0",
"prettier": "^3.5.0",
"request-light": "^0.5.7",
"vscode-json-languageservice": "4.1.8",
"vscode-languageserver": "^9.0.0",
"vscode-languageserver-textdocument": "^1.0.1",
"vscode-languageserver-types": "^3.16.0",
"vscode-uri": "^3.0.2",
"yaml": "2.7.1"
},
"bin": {
"yaml-language-server": "bin/yaml-language-server"
}
},
"node_modules/yaml-language-server/node_modules/ajv": {
"version": "8.18.0",
"resolved": "https://registry.npmjs.org/ajv/-/ajv-8.18.0.tgz",
"integrity": "sha512-PlXPeEWMXMZ7sPYOHqmDyCJzcfNrUr3fGNKtezX14ykXOEIvyK81d+qydx89KY5O71FKMPaQ2vBfBFI5NHR63A==",
"dev": true,
"license": "MIT",
"dependencies": {
"fast-deep-equal": "^3.1.3",
"fast-uri": "^3.0.1",
"json-schema-traverse": "^1.0.0",
"require-from-string": "^2.0.2"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/epoberezkin"
}
},
"node_modules/yaml-language-server/node_modules/ajv-draft-04": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/ajv-draft-04/-/ajv-draft-04-1.0.0.tgz",
"integrity": "sha512-mv00Te6nmYbRp5DCwclxtt7yV/joXJPGS7nM+97GdxvuttCOfgI3K4U25zboyeX0O+myI8ERluxQe5wljMmVIw==",
"dev": true,
"license": "MIT",
"peerDependencies": {
"ajv": "^8.5.0"
},
"peerDependenciesMeta": {
"ajv": {
"optional": true
}
}
},
"node_modules/yaml-language-server/node_modules/json-schema-traverse": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-1.0.0.tgz",
"integrity": "sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug==",
"dev": true,
"license": "MIT"
},
"node_modules/yaml-language-server/node_modules/request-light": {
"version": "0.5.8",
"resolved": "https://registry.npmjs.org/request-light/-/request-light-0.5.8.tgz",
"integrity": "sha512-3Zjgh+8b5fhRJBQZoy+zbVKpAQGLyka0MPgW3zruTF4dFFJ8Fqcfu9YsAvi/rvdcaTeWG3MkbZv4WKxAn/84Lg==",
"dev": true,
"license": "MIT"
},
"node_modules/yargs": { "node_modules/yargs": {
"version": "17.7.2", "version": "17.7.2",
"resolved": "https://registry.npmjs.org/yargs/-/yargs-17.7.2.tgz", "resolved": "https://registry.npmjs.org/yargs/-/yargs-17.7.2.tgz",

View File

@@ -3,6 +3,9 @@
"type": "module",
"version": "0.0.1",
"private": true,
+ "engines": {
+ "node": ">=20.19.0"
+ },
"scripts": {
"dev": "astro dev",
"build": "node ../scripts/sync-website-installers.mjs && astro build",
@@ -30,7 +33,21 @@
"tailwindcss": "^4.2.1",
"tw-animate-css": "^1.4.0"
},
+ "overrides": {
+ "@modelcontextprotocol/sdk": {
+ "@hono/node-server": "1.19.13",
+ "hono": "4.12.12"
+ },
+ "router": {
+ "path-to-regexp": "8.4.2"
+ },
+ "defu": "6.1.7",
+ "vite": "6.4.2",
+ "brace-expansion": "1.1.13",
+ "yaml": "2.8.3"
+ },
"devDependencies": {
+ "@astrojs/check": "^0.9.8",
"@eslint/js": "^9.39.4",
"eslint": "^9.39.4",
"eslint-plugin-react-hooks": "^7.0.1",

View File

@@ -2,7 +2,7 @@
set -eu
- VERSION="${1:-edge}"
+ VERSION="${1:-latest}"
INSTALL_BIN_DIR="${FEYNMAN_INSTALL_BIN_DIR:-$HOME/.local/bin}"
INSTALL_APP_DIR="${FEYNMAN_INSTALL_APP_DIR:-$HOME/.local/share/feynman}"
SKIP_PATH_UPDATE="${FEYNMAN_INSTALL_SKIP_PATH_UPDATE:-0}"
@@ -54,12 +54,16 @@ run_with_spinner() {
normalize_version() {
case "$1" in
- "" | edge)
- printf 'edge\n'
+ "")
+ printf 'latest\n'
;;
latest | stable)
printf 'latest\n'
;;
+ edge)
+ echo "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." >&2
+ exit 1
+ ;;
v*)
printf '%s\n' "${1#v}"
;;
@@ -160,39 +164,29 @@ require_command() {
fi
}
- resolve_release_metadata() {
- normalized_version="$(normalize_version "$VERSION")"
- if [ "$normalized_version" = "edge" ]; then
- release_json="$(download_text "https://api.github.com/repos/getcompanion-ai/feynman/releases/tags/edge")"
- asset_url=""
- for candidate in $(printf '%s\n' "$release_json" | sed -n 's/.*"browser_download_url":[[:space:]]*"\([^"]*\)".*/\1/p'); do
- case "$candidate" in
- */feynman-*-${asset_target}.${archive_extension})
- asset_url="$candidate"
- break
- ;;
- esac
- done
- if [ -z "$asset_url" ]; then
- echo "Failed to resolve the latest Feynman edge bundle." >&2
- exit 1
- fi
- archive_name="${asset_url##*/}"
- bundle_name="${archive_name%.$archive_extension}"
- resolved_version="${bundle_name#feynman-}"
- resolved_version="${resolved_version%-${asset_target}}"
- printf '%s\n%s\n%s\n%s\n' "$resolved_version" "$bundle_name" "$archive_name" "$asset_url"
- return
- fi
+ warn_command_conflict() {
+ expected_path="$INSTALL_BIN_DIR/feynman"
+ resolved_path="$(command -v feynman 2>/dev/null || true)"
+ if [ -z "$resolved_path" ]; then
+ return
+ fi
+ if [ "$resolved_path" != "$expected_path" ]; then
+ step "Warning: current shell resolves feynman to $resolved_path"
+ step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
+ step "Or launch directly: $expected_path"
+ step "If that path is an old package-manager install, remove it or put $INSTALL_BIN_DIR first on PATH."
+ fi
+ }
+ resolve_release_metadata() {
+ normalized_version="$(normalize_version "$VERSION")"
if [ "$normalized_version" = "latest" ]; then
- release_json="$(download_text "https://api.github.com/repos/getcompanion-ai/feynman/releases/latest")"
- resolved_version="$(printf '%s\n' "$release_json" | sed -n 's/.*"tag_name":[[:space:]]*"v\([^"]*\)".*/\1/p' | head -n 1)"
+ release_page="$(download_text "https://github.com/getcompanion-ai/feynman/releases/latest")"
+ resolved_version="$(printf '%s\n' "$release_page" | sed -n 's@.*releases/tag/v\([0-9][^"<>[:space:]]*\).*@\1@p' | head -n 1)"
if [ -z "$resolved_version" ]; then
echo "Failed to resolve the latest Feynman release version." >&2
@@ -266,8 +260,8 @@ This usually means the release exists, but not all platform bundles were uploaded
Workarounds:
- try again after the release finishes publishing
- - install via pnpm instead: pnpm add -g @companion-ai/feynman
- - install via bun instead: bun add -g @companion-ai/feynman
+ - pass the latest published version explicitly, e.g.:
+ curl -fsSL https://feynman.is/install | bash -s -- 0.2.16
EOF
exit 1
fi
@@ -290,20 +284,22 @@ add_to_path
case "$path_action" in
added)
step "PATH updated for future shells in $path_profile"
- step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && feynman"
+ step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
;;
configured)
step "PATH is already configured for future shells in $path_profile"
- step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && feynman"
+ step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
;;
skipped)
step "PATH update skipped"
- step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && feynman"
+ step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
;;
*)
step "$INSTALL_BIN_DIR is already on PATH"
- step "Run: feynman"
+ step "Run: hash -r && feynman"
;;
esac
+ warn_command_conflict
printf 'Feynman %s installed successfully.\n' "$resolved_version"
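The diff above replaces the GitHub API call with a scrape of the public releases page, so "latest" is now resolved by pulling the first `releases/tag/vX.Y.Z` link out of HTML. A minimal sketch of that sed extraction, runnable against a sample page fragment (the sample HTML below is a stand-in, not fetched output):

```shell
# Extract the first "releases/tag/vX.Y.Z" version from a page body,
# mirroring the sed pattern used by resolve_release_metadata above.
resolve_latest_from_html() {
  printf '%s\n' "$1" | sed -n 's@.*releases/tag/v\([0-9][^"<>[:space:]]*\).*@\1@p' | head -n 1
}

# Sample fragment standing in for the releases/latest page:
sample='<a href="/getcompanion-ai/feynman/releases/tag/v0.2.16">Release v0.2.16</a>'
resolve_latest_from_html "$sample"   # prints 0.2.16
```

One trade-off of this approach: it no longer hits the unauthenticated GitHub API (which is rate-limited), but it now depends on the release page's markup containing a `releases/tag/v…` link.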

View File

@@ -0,0 +1,210 @@
#!/bin/sh
set -eu
VERSION="latest"
SCOPE="${FEYNMAN_SKILLS_SCOPE:-user}"
TARGET_DIR="${FEYNMAN_SKILLS_DIR:-}"
step() {
printf '==> %s\n' "$1"
}
normalize_version() {
case "$1" in
"")
printf 'latest\n'
;;
latest | stable)
printf 'latest\n'
;;
edge)
echo "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." >&2
exit 1
;;
v*)
printf '%s\n' "${1#v}"
;;
*)
printf '%s\n' "$1"
;;
esac
}
download_file() {
url="$1"
output="$2"
if command -v curl >/dev/null 2>&1; then
if [ -t 2 ]; then
curl -fL --progress-bar "$url" -o "$output"
else
curl -fsSL "$url" -o "$output"
fi
return
fi
if command -v wget >/dev/null 2>&1; then
if [ -t 2 ]; then
wget --show-progress -O "$output" "$url"
else
wget -q -O "$output" "$url"
fi
return
fi
echo "curl or wget is required to install Feynman skills." >&2
exit 1
}
download_text() {
url="$1"
if command -v curl >/dev/null 2>&1; then
curl -fsSL "$url"
return
fi
if command -v wget >/dev/null 2>&1; then
wget -q -O - "$url"
return
fi
echo "curl or wget is required to install Feynman skills." >&2
exit 1
}
resolve_version() {
normalized_version="$(normalize_version "$VERSION")"
if [ "$normalized_version" = "latest" ]; then
release_page="$(download_text "https://github.com/getcompanion-ai/feynman/releases/latest")"
resolved_version="$(printf '%s\n' "$release_page" | sed -n 's@.*releases/tag/v\([0-9][^"<>[:space:]]*\).*@\1@p' | head -n 1)"
if [ -z "$resolved_version" ]; then
echo "Failed to resolve the latest Feynman release version." >&2
exit 1
fi
printf '%s\nv%s\n' "$resolved_version" "$resolved_version"
return
fi
printf '%s\nv%s\n' "$normalized_version" "$normalized_version"
}
resolve_target_dir() {
if [ -n "$TARGET_DIR" ]; then
printf '%s\n' "$TARGET_DIR"
return
fi
case "$SCOPE" in
repo)
printf '%s/.agents/skills/feynman\n' "$PWD"
;;
user)
codex_home="${CODEX_HOME:-$HOME/.codex}"
printf '%s/skills/feynman\n' "$codex_home"
;;
*)
echo "Unknown scope: $SCOPE (expected --user or --repo)" >&2
exit 1
;;
esac
}
while [ $# -gt 0 ]; do
case "$1" in
--repo)
SCOPE="repo"
;;
--user)
SCOPE="user"
;;
--dir)
if [ $# -lt 2 ]; then
echo "Usage: install-skills.sh [stable|latest|<version>] [--user|--repo] [--dir <path>]" >&2
exit 1
fi
TARGET_DIR="$2"
shift
;;
edge|stable|latest|v*|[0-9]*)
VERSION="$1"
;;
*)
echo "Unknown argument: $1" >&2
echo "Usage: install-skills.sh [stable|latest|<version>] [--user|--repo] [--dir <path>]" >&2
exit 1
;;
esac
shift
done
archive_metadata="$(resolve_version)"
resolved_version="$(printf '%s\n' "$archive_metadata" | sed -n '1p')"
git_ref="$(printf '%s\n' "$archive_metadata" | sed -n '2p')"
archive_url="${FEYNMAN_INSTALL_SKILLS_ARCHIVE_URL:-}"
if [ -z "$archive_url" ]; then
case "$git_ref" in
main)
archive_url="https://github.com/getcompanion-ai/feynman/archive/refs/heads/main.tar.gz"
;;
v*)
archive_url="https://github.com/getcompanion-ai/feynman/archive/refs/tags/${git_ref}.tar.gz"
;;
esac
fi
if [ -z "$archive_url" ]; then
echo "Could not resolve a download URL for ref: $git_ref" >&2
exit 1
fi
install_dir="$(resolve_target_dir)"
step "Installing Feynman skills ${resolved_version} (${SCOPE})"
tmp_dir="$(mktemp -d)"
cleanup() {
rm -rf "$tmp_dir"
}
trap cleanup EXIT INT TERM
archive_path="$tmp_dir/feynman-skills.tar.gz"
step "Downloading skills archive"
download_file "$archive_url" "$archive_path"
extract_dir="$tmp_dir/extract"
mkdir -p "$extract_dir"
step "Extracting skills"
tar -xzf "$archive_path" -C "$extract_dir"
source_root="$(find "$extract_dir" -mindepth 1 -maxdepth 1 -type d | head -n 1)"
if [ -z "$source_root" ] || [ ! -d "$source_root/skills" ] || [ ! -d "$source_root/prompts" ]; then
echo "Could not find the bundled skills resources in the downloaded archive." >&2
exit 1
fi
mkdir -p "$(dirname "$install_dir")"
rm -rf "$install_dir"
mkdir -p "$install_dir"
cp -R "$source_root/skills/." "$install_dir/"
mkdir -p "$install_dir/prompts"
cp -R "$source_root/prompts/." "$install_dir/prompts/"
cp "$source_root/AGENTS.md" "$install_dir/AGENTS.md"
cp "$source_root/CONTRIBUTING.md" "$install_dir/CONTRIBUTING.md"
step "Installed skills to $install_dir"
case "$SCOPE" in
repo)
step "Repo-local skills will be discovered automatically from .agents/skills"
;;
user)
step "User-level skills will be discovered from \$CODEX_HOME/skills"
;;
esac
printf 'Feynman skills %s installed successfully.\n' "$resolved_version"
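The version-normalization contract shared by these installers can be exercised in isolation. This is a trimmed copy of the `normalize_version` function above (error text shortened, and `exit` swapped for `return` so it can be sourced safely):

```shell
# Trimmed copy of normalize_version: empty/latest/stable collapse to
# "latest", the removed edge channel is rejected, "v" prefixes drop.
normalize_version() {
  case "$1" in
    "" | latest | stable)
      printf 'latest\n'
      ;;
    edge)
      echo "The edge channel has been removed." >&2
      return 1
      ;;
    v*)
      printf '%s\n' "${1#v}"
      ;;
    *)
      printf '%s\n' "$1"
      ;;
  esac
}

normalize_version v0.2.16   # prints 0.2.16
normalize_version stable    # prints latest
```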

View File

@@ -0,0 +1,128 @@
param(
[string]$Version = "latest",
[ValidateSet("User", "Repo")]
[string]$Scope = "User",
[string]$TargetDir = ""
)
$ErrorActionPreference = "Stop"
function Normalize-Version {
param([string]$RequestedVersion)
if (-not $RequestedVersion) {
return "latest"
}
switch ($RequestedVersion.ToLowerInvariant()) {
"latest" { return "latest" }
"stable" { return "latest" }
"edge" { throw "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." }
default { return $RequestedVersion.TrimStart("v") }
}
}
function Resolve-LatestReleaseVersion {
$page = Invoke-WebRequest -Uri "https://github.com/getcompanion-ai/feynman/releases/latest"
$match = [regex]::Match($page.Content, 'releases/tag/v([0-9][^"''<>\s]*)')
if (-not $match.Success) {
throw "Failed to resolve the latest Feynman release version."
}
return $match.Groups[1].Value
}
function Resolve-VersionMetadata {
param([string]$RequestedVersion)
$normalizedVersion = Normalize-Version -RequestedVersion $RequestedVersion
if ($normalizedVersion -eq "latest") {
$resolvedVersion = Resolve-LatestReleaseVersion
} else {
$resolvedVersion = $normalizedVersion
}
return [PSCustomObject]@{
ResolvedVersion = $resolvedVersion
GitRef = "v$resolvedVersion"
DownloadUrl = if ($env:FEYNMAN_INSTALL_SKILLS_ARCHIVE_URL) { $env:FEYNMAN_INSTALL_SKILLS_ARCHIVE_URL } else { "https://github.com/getcompanion-ai/feynman/archive/refs/tags/v$resolvedVersion.zip" }
}
}
function Resolve-InstallDir {
param(
[string]$ResolvedScope,
[string]$ResolvedTargetDir
)
if ($ResolvedTargetDir) {
return $ResolvedTargetDir
}
if ($ResolvedScope -eq "Repo") {
return Join-Path (Get-Location) ".agents\skills\feynman"
}
$codexHome = if ($env:CODEX_HOME) { $env:CODEX_HOME } else { Join-Path $HOME ".codex" }
return Join-Path $codexHome "skills\feynman"
}
$metadata = Resolve-VersionMetadata -RequestedVersion $Version
$resolvedVersion = $metadata.ResolvedVersion
$downloadUrl = $metadata.DownloadUrl
$installDir = Resolve-InstallDir -ResolvedScope $Scope -ResolvedTargetDir $TargetDir
$tmpDir = Join-Path ([System.IO.Path]::GetTempPath()) ("feynman-skills-install-" + [System.Guid]::NewGuid().ToString("N"))
New-Item -ItemType Directory -Path $tmpDir | Out-Null
try {
$archivePath = Join-Path $tmpDir "feynman-skills.zip"
$extractDir = Join-Path $tmpDir "extract"
Write-Host "==> Downloading Feynman skills $resolvedVersion"
Invoke-WebRequest -Uri $downloadUrl -OutFile $archivePath
Write-Host "==> Extracting skills"
Expand-Archive -LiteralPath $archivePath -DestinationPath $extractDir -Force
$sourceRoot = Get-ChildItem -Path $extractDir -Directory | Select-Object -First 1
if (-not $sourceRoot) {
throw "Could not find extracted Feynman archive."
}
$skillsSource = Join-Path $sourceRoot.FullName "skills"
$promptsSource = Join-Path $sourceRoot.FullName "prompts"
if (-not (Test-Path $skillsSource) -or -not (Test-Path $promptsSource)) {
throw "Could not find the bundled skills resources in the downloaded archive."
}
$installParent = Split-Path $installDir -Parent
if ($installParent) {
New-Item -ItemType Directory -Path $installParent -Force | Out-Null
}
if (Test-Path $installDir) {
Remove-Item -Recurse -Force $installDir
}
New-Item -ItemType Directory -Path $installDir -Force | Out-Null
Copy-Item -Path (Join-Path $skillsSource "*") -Destination $installDir -Recurse -Force
New-Item -ItemType Directory -Path (Join-Path $installDir "prompts") -Force | Out-Null
Copy-Item -Path (Join-Path $promptsSource "*") -Destination (Join-Path $installDir "prompts") -Recurse -Force
Copy-Item -Path (Join-Path $sourceRoot.FullName "AGENTS.md") -Destination (Join-Path $installDir "AGENTS.md") -Force
Copy-Item -Path (Join-Path $sourceRoot.FullName "CONTRIBUTING.md") -Destination (Join-Path $installDir "CONTRIBUTING.md") -Force
Write-Host "==> Installed skills to $installDir"
if ($Scope -eq "Repo") {
Write-Host "Repo-local skills will be discovered automatically from .agents/skills."
} else {
Write-Host "User-level skills will be discovered from `$CODEX_HOME/skills."
}
Write-Host "Feynman skills $resolvedVersion installed successfully."
} finally {
if (Test-Path $tmpDir) {
Remove-Item -Recurse -Force $tmpDir
}
}

View File

@@ -1,5 +1,5 @@
param( param(
[string]$Version = "edge" [string]$Version = "latest"
) )
$ErrorActionPreference = "Stop" $ErrorActionPreference = "Stop"
@@ -8,17 +8,27 @@ function Normalize-Version {
param([string]$RequestedVersion) param([string]$RequestedVersion)
if (-not $RequestedVersion) { if (-not $RequestedVersion) {
return "edge" return "latest"
} }
switch ($RequestedVersion.ToLowerInvariant()) { switch ($RequestedVersion.ToLowerInvariant()) {
"edge" { return "edge" }
"latest" { return "latest" } "latest" { return "latest" }
"stable" { return "latest" } "stable" { return "latest" }
"edge" { throw "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." }
default { return $RequestedVersion.TrimStart("v") } default { return $RequestedVersion.TrimStart("v") }
} }
} }
function Resolve-LatestReleaseVersion {
$page = Invoke-WebRequest -Uri "https://github.com/getcompanion-ai/feynman/releases/latest"
$match = [regex]::Match($page.Content, 'releases/tag/v([0-9][^"''<>\s]*)')
if (-not $match.Success) {
throw "Failed to resolve the latest Feynman release version."
}
return $match.Groups[1].Value
}
function Resolve-ReleaseMetadata { function Resolve-ReleaseMetadata {
param( param(
[string]$RequestedVersion, [string]$RequestedVersion,
@@ -28,34 +38,8 @@ function Resolve-ReleaseMetadata {
$normalizedVersion = Normalize-Version -RequestedVersion $RequestedVersion $normalizedVersion = Normalize-Version -RequestedVersion $RequestedVersion
if ($normalizedVersion -eq "edge") {
$release = Invoke-RestMethod -Uri "https://api.github.com/repos/getcompanion-ai/feynman/releases/tags/edge"
$asset = $release.assets | Where-Object { $_.name -like "feynman-*-$AssetTarget.$BundleExtension" } | Select-Object -First 1
if (-not $asset) {
throw "Failed to resolve the latest Feynman edge bundle."
}
$archiveName = $asset.name
$suffix = ".$BundleExtension"
$bundleName = $archiveName.Substring(0, $archiveName.Length - $suffix.Length)
$resolvedVersion = $bundleName.Substring("feynman-".Length)
$resolvedVersion = $resolvedVersion.Substring(0, $resolvedVersion.Length - ("-$AssetTarget").Length)
return [PSCustomObject]@{
ResolvedVersion = $resolvedVersion
BundleName = $bundleName
ArchiveName = $archiveName
DownloadUrl = $asset.browser_download_url
}
}
if ($normalizedVersion -eq "latest") { if ($normalizedVersion -eq "latest") {
$release = Invoke-RestMethod -Uri "https://api.github.com/repos/getcompanion-ai/feynman/releases/latest" $resolvedVersion = Resolve-LatestReleaseVersion
if (-not $release.tag_name) {
throw "Failed to resolve the latest Feynman release version."
}
$resolvedVersion = $release.tag_name.TrimStart("v")
} else { } else {
$resolvedVersion = $normalizedVersion $resolvedVersion = $normalizedVersion
} }
@@ -73,12 +57,26 @@ function Resolve-ReleaseMetadata {
} }
function Get-ArchSuffix { function Get-ArchSuffix {
$arch = [System.Runtime.InteropServices.RuntimeInformation]::OSArchitecture # Prefer PROCESSOR_ARCHITECTURE which is always available on Windows.
switch ($arch.ToString()) { # RuntimeInformation::OSArchitecture requires .NET 4.7.1+ and may not
"X64" { return "x64" } # be loaded in every Windows PowerShell 5.1 session.
"Arm64" { return "arm64" } $envArch = $env:PROCESSOR_ARCHITECTURE
default { throw "Unsupported architecture: $arch" } if ($envArch) {
switch ($envArch) {
"AMD64" { return "x64" }
"ARM64" { return "arm64" }
}
} }
try {
$arch = [System.Runtime.InteropServices.RuntimeInformation]::OSArchitecture
switch ($arch.ToString()) {
"X64" { return "x64" }
"Arm64" { return "arm64" }
}
} catch {}
throw "Unsupported architecture: $envArch"
} }
$archSuffix = Get-ArchSuffix $archSuffix = Get-ArchSuffix
@@ -111,8 +109,8 @@ This usually means the release exists, but not all platform bundles were uploade
 Workarounds:
 - try again after the release finishes publishing
-- install via pnpm instead: pnpm add -g @companion-ai/feynman
-- install via bun instead: bun add -g @companion-ai/feynman
+- pass the latest published version explicitly, e.g.:
+  & ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.16
 "@
 }
@@ -127,14 +125,24 @@ Workarounds:
     New-Item -ItemType Directory -Path $installBinDir -Force | Out-Null
     $shimPath = Join-Path $installBinDir "feynman.cmd"
+    $shimPs1Path = Join-Path $installBinDir "feynman.ps1"
     Write-Host "==> Linking feynman into $installBinDir"
     @"
 @echo off
-"$bundleDir\feynman.cmd" %*
+CALL "$bundleDir\feynman.cmd" %*
 "@ | Set-Content -Path $shimPath -Encoding ASCII
+    @"
+`$BundleDir = "$bundleDir"
+& "`$BundleDir\node\node.exe" "`$BundleDir\app\bin\feynman.js" @args
+"@ | Set-Content -Path $shimPs1Path -Encoding UTF8
     $currentUserPath = [Environment]::GetEnvironmentVariable("Path", "User")
-    if (-not $currentUserPath.Split(';').Contains($installBinDir)) {
+    $alreadyOnPath = $false
+    if ($currentUserPath) {
+        $alreadyOnPath = $currentUserPath.Split(';') -contains $installBinDir
+    }
+    if (-not $alreadyOnPath) {
         $updatedPath = if ([string]::IsNullOrWhiteSpace($currentUserPath)) {
             $installBinDir
         } else {
@@ -146,6 +154,14 @@ Workarounds:
         Write-Host "$installBinDir is already on PATH."
     }
+    $resolvedCommand = Get-Command feynman -ErrorAction SilentlyContinue
+    if ($resolvedCommand -and $resolvedCommand.Source -ne $shimPath) {
+        Write-Warning "Current shell resolves feynman to $($resolvedCommand.Source)"
+        Write-Host "Run in a new shell, or run: `$env:Path = '$installBinDir;' + `$env:Path"
+        Write-Host "Then run: feynman"
+        Write-Host "If that path is an old package-manager install, remove it or put $installBinDir first on PATH."
+    }
     Write-Host "Feynman $resolvedVersion installed successfully."
 } finally {
     if (Test-Path $tmpDir) {

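The installer's PATH-membership check above (split the user `Path` on `;` and test containment before prepending) can be sketched in POSIX shell with `:` separators. This is an illustrative translation only; `path_contains` is a hypothetical helper, not part of the installer:

```shell
# Return success iff directory $1 is one of the ':'-separated entries in $2.
# Wrapping both sides in ':' avoids false matches on prefixes like /opt/tools
# vs /opt/tools/bin.
path_contains() {
  case ":$2:" in
    *":$1:"*) return 0 ;;
    *) return 1 ;;
  esac
}

if path_contains "$HOME/.local/bin" "/usr/bin:$HOME/.local/bin"; then
  echo "already on PATH"
else
  echo "adding to PATH"
fi
```

The same exact-entry comparison is what the patch switches to on Windows (`Split(';') -contains`), rather than a substring test.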
View File

@@ -46,4 +46,4 @@ function Badge({
   )
 }
 
-export { Badge, badgeVariants }
+export { Badge }

View File

@@ -64,4 +64,4 @@ function Button({
   )
 }
 
-export { Button, buttonVariants }
+export { Button }

View File

@@ -41,6 +41,36 @@ To see all models you have configured:
 feynman model list
 ```
 
+Only authenticated/configured providers appear in `feynman model list`. If you only see OpenAI models, it usually means only OpenAI auth is configured so far.
+
+To add another provider, authenticate it first:
+
+```bash
+feynman model login anthropic
+feynman model login google
+```
+
+Then switch the default model:
+
+```bash
+feynman model set anthropic/claude-opus-4-6
+```
+
+## Subagent model overrides
+
+Feynman's bundled subagents inherit the main default model unless you override them explicitly. Inside the REPL, run:
+
+```bash
+/feynman-model
+```
+
+This opens an interactive picker where you can either:
+
+- change the main default model for the session environment
+- assign a different model to a specific bundled subagent such as `researcher`, `reviewer`, `writer`, or `verifier`
+
+Per-subagent overrides are persisted in the synced agent files under `~/.feynman/agent/agents/` with a `model:` frontmatter field. Removing that field makes the subagent inherit the main default model again.
+
 ## Thinking levels
 
 The `thinkingLevel` field controls how much reasoning the model does before responding. Available levels are `off`, `minimal`, `low`, `medium`, `high`, and `xhigh`. Higher levels produce more thorough analysis at the cost of latency and token usage. You can override per-session:
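The per-subagent override described in this doc diff lives in a `model:` frontmatter field of a synced agent file. A minimal sketch of reading one back, assuming the documented layout (the file name `researcher.md` and the model value here are hypothetical):

```shell
# Build a throwaway agent file shaped like the docs describe, then extract
# the override line the same way a script might check for one.
agent_file=$(mktemp)
cat > "$agent_file" <<'EOF'
---
model: anthropic/claude-opus-4-6
---
You are the researcher subagent.
EOF
override=$(grep '^model:' "$agent_file")
echo "$override"   # prints: model: anthropic/claude-opus-4-6
rm -f "$agent_file"
```

Deleting that line (or the whole field) is what makes the subagent fall back to the main default model.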

View File

@@ -1,11 +1,11 @@
 ---
 title: Installation
-description: Install Feynman on macOS, Linux, or Windows using curl, pnpm, or bun.
+description: Install Feynman on macOS, Linux, or Windows using the standalone installer.
 section: Getting Started
 order: 1
 ---
 
-Feynman ships as a standalone runtime bundle for macOS, Linux, and Windows, and as a package-manager install for environments where Node.js is already installed. The recommended approach is the one-line installer, which downloads a prebuilt native bundle with zero external runtime dependencies.
+Feynman ships as a standalone runtime bundle for macOS, Linux, and Windows. The one-line installer downloads a prebuilt native bundle with zero external runtime dependencies.
 
 ## One-line installer (recommended)
@@ -17,7 +17,7 @@ curl -fsSL https://feynman.is/install | bash
 The installer detects your OS and architecture automatically. On macOS it supports both Intel and Apple Silicon. On Linux it supports x64 and arm64. The launcher is installed to `~/.local/bin`, the bundled runtime is unpacked into `~/.local/share/feynman`, and your `PATH` is updated when needed.
 
-By default, the one-line installer tracks the rolling `edge` channel from `main`. If you previously installed Feynman through a package manager and still see local Node.js errors after a curl install, your shell is probably still resolving the older global binary first. Run `which -a feynman`, then `hash -r`, or launch the standalone shim directly with `~/.local/bin/feynman`.
-
 On **Windows**, open PowerShell as Administrator and run:
@@ -27,50 +27,50 @@ irm https://feynman.is/install.ps1 | iex
 This installs the Windows runtime bundle under `%LOCALAPPDATA%\Programs\feynman`, adds its launcher to your user `PATH`, and lets you re-run the installer at any time to update.
 
-## Stable or pinned releases
-
-If you want the latest tagged release instead of the rolling `edge` channel:
-
-```bash
-curl -fsSL https://feynman.is/install | bash -s -- stable
-```
-
-On Windows:
-
-```powershell
-& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version stable
-```
-
-You can also pin an exact version by replacing `stable` with a version such as `0.2.13`.
-
-## pnpm
-
-If you already have Node.js 20.18.1+ installed, you can install Feynman globally via `pnpm`:
-
-```bash
-pnpm add -g @companion-ai/feynman
-```
-
-Or run it directly without installing:
-
-```bash
-pnpm dlx @companion-ai/feynman
-```
-
-## bun
-
-```bash
-bun add -g @companion-ai/feynman
-```
-
-Or run it directly without installing:
-
-```bash
-bunx @companion-ai/feynman
-```
-
-Both package-manager distributions ship the same core application but depend on Node.js being present on your system. The standalone installer is preferred because it bundles its own Node runtime and works without a separate Node installation.
+## Skills only
+
+If you only want Feynman's research skills and not the full terminal runtime, install the skill library separately.
+
+For a user-level install into `~/.codex/skills/feynman`:
+
+```bash
+curl -fsSL https://feynman.is/install-skills | bash
+```
+
+For a repo-local install into `.agents/skills/feynman` under the current repository:
+
+```bash
+curl -fsSL https://feynman.is/install-skills | bash -s -- --repo
+```
+
+On Windows, install the skills into your Codex skill directory:
+
+```powershell
+irm https://feynman.is/install-skills.ps1 | iex
+```
+
+Or install them repo-locally:
+
+```powershell
+& ([scriptblock]::Create((irm https://feynman.is/install-skills.ps1))) -Scope Repo
+```
+
+These installers download the bundled `skills/` and `prompts/` trees plus the repo guidance files referenced by those skills. They do not install the Feynman terminal, bundled Node runtime, auth storage, or Pi packages.
+
+## Pinned releases
+
+The one-line installer already targets the latest tagged release. To pin an exact version, pass it explicitly:
+
+```bash
+curl -fsSL https://feynman.is/install | bash -s -- 0.2.17
+```
+
+On Windows:
+
+```powershell
+& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.17
+```
 
 ## Post-install setup
 
 After installation, run the guided setup wizard to configure your model provider and API keys:
@@ -98,6 +98,7 @@ For contributing or running Feynman from source:
 ```bash
 git clone https://github.com/getcompanion-ai/feynman.git
 cd feynman
-pnpm install
-pnpm start
+nvm use || nvm install
+npm install
+npm start
 ```
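The pinned-release flow in this doc diff passes a bare version like `0.2.17`, while the installer script resolves GitHub tags of the form `v0.2.17` (its PowerShell side calls `TrimStart("v")` on `tag_name`). A minimal shell sketch of that normalization, as an illustration only (`normalize_tag` is a hypothetical helper, and the shell form strips just one leading `v`):

```shell
# Strip a single leading "v" from a release tag so "v0.2.17" and "0.2.17"
# both resolve to the same bare version string.
normalize_tag() {
  printf '%s\n' "${1#v}"
}

normalize_tag v0.2.17   # prints: 0.2.17
normalize_tag 0.2.17    # already bare, printed unchanged
```

Accepting both forms keeps `bash -s -- 0.2.17` and a copy-pasted git tag interchangeable.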

View File

@@ -42,6 +42,33 @@ For API key providers, you are prompted to paste your key directly:
 Keys are encrypted at rest and never sent anywhere except the provider's API endpoint.
 
+### Local models: Ollama, LM Studio, vLLM
+
+If you want to use a model running locally, choose the API-key flow and then select:
+
+```text
+Custom provider (baseUrl + API key)
+```
+
+For Ollama, the typical settings are:
+
+```text
+API mode: openai-completions
+Base URL: http://localhost:11434/v1
+Authorization header: No
+Model ids: llama3.1:8b
+API key: local
+```
+
+That same custom-provider flow also works for other OpenAI-compatible local servers such as LM Studio or vLLM. After saving the provider, run:
+
+```bash
+feynman model list
+feynman model set <provider>/<model-id>
+```
+
+to confirm the local model is available and make it the default.
+
 ## Stage 3: Optional packages
 
 Feynman's core ships with the essentials, but some features require additional packages. The wizard asks if you want to install optional presets:
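The `<provider>/<model-id>` form taken by `feynman model set` in the diff above splits on the first `/`, which matters because local model ids can themselves contain `:` (e.g. `llama3.1:8b`). A sketch of that split in shell, with hypothetical example values:

```shell
# Split "provider/model-id" on the first "/" only; everything after it,
# including any ":" tag, belongs to the model id.
spec="ollama/llama3.1:8b"
provider="${spec%%/*}"   # longest match from the end removed -> "ollama"
model="${spec#*/}"       # shortest match from the start removed -> "llama3.1:8b"

printf '%s\n' "$provider"   # prints: ollama
printf '%s\n' "$model"      # prints: llama3.1:8b
```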

Some files were not shown because too many files have changed in this diff.