16 Commits

Author SHA1 Message Date
Advait Paliwal
e2fdf0d505 fix: exclude release bundles from npm publish 2026-03-27 14:04:16 -07:00
Advait Paliwal
cba7532d59 release: bump to 0.2.15 2026-03-27 13:58:55 -07:00
topeuph-ai
2dea96f25f feat: add valichord-validation skill — blind commit-reveal reproducibility verification
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 13:57:41 -07:00
Advait Paliwal
83a570235f docs: add contributor guide and repo skill 2026-03-27 12:09:09 -07:00
Advait Paliwal
ff6328121e fix: align .nvmrc with supported node floor 2026-03-27 11:36:49 -07:00
Advait Paliwal
404c8b5469 Unify installers on tagged releases 2026-03-26 18:17:48 -07:00
Advait Paliwal
4c62e78ca5 fix: enforce bundled node version floor 2026-03-26 17:49:11 -07:00
Advait Paliwal
10c93a673b fix: align declared node version floor 2026-03-26 17:22:56 -07:00
Mochamad Chairulridjal
30d07246d1 feat: add API key and custom provider configuration (#4)
* feat: add API key and custom provider configuration

Previously, model setup only offered OAuth login. This adds:

- API key configuration for 17 built-in providers (OpenAI, Anthropic,
  Google, Mistral, Groq, xAI, OpenRouter, etc.)
- Custom provider setup via models.json (for Ollama, vLLM, LM Studio,
  proxies, or any OpenAI/Anthropic/Google-compatible endpoint)
- Interactive prompts with smart defaults and auto-detection of models
- Verification flow that probes endpoints and provides actionable tips
- Doctor diagnostics for models.json path and missing apiKey warnings
- Dev environment fallback for running without dist/ build artifacts
- Unified auth flow: `feynman model login` now offers both API key
  and OAuth options (OAuth-only when a specific provider is given)

New files:
- src/model/models-json.ts: Read/write models.json with proper merging
- src/model/registry.ts: Centralized ModelRegistry creation with modelsJsonPath
- tests/models-json.test.ts: Unit tests for provider config upsert

* fix: harden runtime env and custom provider auth

---------

Co-authored-by: Advait Paliwal <advaitspaliwal@gmail.com>
2026-03-26 17:09:38 -07:00
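The provider-config upsert this commit describes (merging a new or updated provider entry into models.json without clobbering sibling providers or fields the caller did not set) can be sketched as below. The `ProviderConfig` shape and field names (`baseUrl`, `apiKey`, `models`) are assumptions for illustration, not the actual models.json schema.

```typescript
// Hypothetical sketch of a provider-config upsert; the real schema in
// src/model/models-json.ts is not shown in this log.
type ProviderConfig = { baseUrl?: string; apiKey?: string; models?: string[] };
type ModelsJson = { providers: Record<string, ProviderConfig> };

// Merge a patch into one provider entry, preserving untouched fields
// and leaving all other providers (and the input object) unmodified.
function upsertProvider(
  doc: ModelsJson,
  name: string,
  patch: ProviderConfig,
): ModelsJson {
  const existing = doc.providers[name] ?? {};
  return {
    ...doc,
    providers: {
      ...doc.providers,
      [name]: { ...existing, ...patch },
    },
  };
}

const before: ModelsJson = {
  providers: { ollama: { baseUrl: "http://localhost:11434" } },
};
const after = upsertProvider(before, "ollama", { models: ["llama3"] });
// baseUrl survives the merge; models is added alongside it
```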
Jeremy
dbd89d8e3d Claude/windows install compatibility tr di s (#3)
* Fix Windows PowerShell 5.1 compatibility in installer

Use $env:PROCESSOR_ARCHITECTURE for arch detection instead of
RuntimeInformation::OSArchitecture which may not be loaded in
every Windows PowerShell 5.1 session. Also fix null-reference
when user PATH environment variable is empty.

https://claude.ai/code/session_01VFiRDM2ZweyacXN5JneVoP

* Fix executable resolution and tar extraction on Windows

resolveExecutable() used `sh -lc "command -v ..."` which doesn't work
on Windows (no sh). Now uses `cmd /c where` on win32. Also make tar
workspace restoration tolerate symlink failures on Windows — .bin/
symlinks can't be created without Developer Mode, but the actual
package directories are extracted fine.

https://claude.ai/code/session_01VFiRDM2ZweyacXN5JneVoP
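The platform switch described above (`cmd /c where` on win32, `sh -lc "command -v …"` elsewhere) can be sketched roughly as follows; error handling and multi-result behavior are simplified relative to whatever the real resolveExecutable() does.

```typescript
import { execFileSync } from "node:child_process";

// Sketch: resolve an executable on PATH in a platform-aware way.
// `where` can emit multiple matches on Windows, so take the first line.
function resolveExecutable(name: string): string | null {
  try {
    const out =
      process.platform === "win32"
        ? execFileSync("cmd", ["/c", "where", name], { encoding: "utf8" })
        : execFileSync("sh", ["-lc", `command -v ${name}`], { encoding: "utf8" });
    const first = out.split(/\r?\n/).find((line) => line.trim().length > 0);
    return first ? first.trim() : null;
  } catch {
    return null; // lookup command exited nonzero: not found on PATH
  }
}
```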

* Broad Windows compatibility fixes across the codebase

- runtime.ts: Use path.delimiter instead of hardcoded ":" for PATH
  construction — was completely broken on Windows
- executables.ts: Add Windows fallback paths for Chrome, Edge, Brave,
  and Pandoc in Program Files; skip macOS-only paths on win32
- node-version.ts, check-node-version.mjs, bin/feynman.js: Show
  Windows-appropriate install instructions (irm | iex, nodejs.org)
  instead of nvm/curl on win32
- preview.ts: Support winget for pandoc auto-install on Windows, and
  apt on Linux (was macOS/brew only)
- launch.ts: Catch unsupported signal errors on Windows
- README.md: Add Windows PowerShell commands alongside macOS/Linux
  for all install instructions

https://claude.ai/code/session_01VFiRDM2ZweyacXN5JneVoP

* fix: complete windows bootstrap hardening

---------

Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Advait Paliwal <advaitspaliwal@gmail.com>
2026-03-26 17:08:14 -07:00
Advait Paliwal
c8536583bf Add skills-only installers 2026-03-25 14:52:20 -07:00
Advait Paliwal
ca74226c83 Fix mobile website overflow 2026-03-25 14:42:08 -07:00
Advait Paliwal
bc9fa2be86 Fix runtime package resolution and tty shutdown 2026-03-25 14:02:38 -07:00
Advait Paliwal
f6dbacc9d5 Update runtime checks and installer behavior 2026-03-25 13:55:32 -07:00
Advait Paliwal
572de7ba85 Clean up README: single install line, fix replicate descriptions
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 12:29:34 -07:00
Advait Paliwal
85e0c4d8c4 Register alphaXiv research tools as native Pi tools
Replace the alpha-research CLI skill with direct programmatic Pi tool
registrations via @companion-ai/alpha-hub/lib. Tools connect to alphaXiv's
MCP server through the library and reuse the connection across calls
instead of spawning a new CLI process each time.

Registers: alpha_search, alpha_get_paper, alpha_ask_paper,
alpha_annotate_paper, alpha_list_annotations, alpha_read_code.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 11:07:42 -07:00
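The connection-reuse pattern this commit describes (one MCP connection shared across tool calls instead of spawning a CLI process per call) is typically a lazily initialized, memoized client. The sketch below is illustrative only; `connect` and the client shape stand in for whatever @companion-ai/alpha-hub/lib does internally, which this log does not show.

```typescript
// Illustrative memoized connection: the promise is cached, so concurrent
// and subsequent callers share one handshake.
type Client = { ready: boolean };

let clientPromise: Promise<Client> | null = null;
let connectCount = 0;

async function connect(): Promise<Client> {
  connectCount += 1; // stands in for an expensive MCP handshake
  return { ready: true };
}

// Every tool call goes through getClient(); only the first call connects.
async function getClient(): Promise<Client> {
  if (!clientPromise) clientPromise = connect();
  return clientPromise;
}
```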
57 changed files with 2860 additions and 304 deletions

.astro/content.d.ts vendored Normal file

@@ -0,0 +1,154 @@
declare module 'astro:content' {
export interface RenderResult {
Content: import('astro/runtime/server/index.js').AstroComponentFactory;
headings: import('astro').MarkdownHeading[];
remarkPluginFrontmatter: Record<string, any>;
}
interface Render {
'.md': Promise<RenderResult>;
}
export interface RenderedContent {
html: string;
metadata?: {
imagePaths: Array<string>;
[key: string]: unknown;
};
}
type Flatten<T> = T extends { [K: string]: infer U } ? U : never;
export type CollectionKey = keyof DataEntryMap;
export type CollectionEntry<C extends CollectionKey> = Flatten<DataEntryMap[C]>;
type AllValuesOf<T> = T extends any ? T[keyof T] : never;
export type ReferenceDataEntry<
C extends CollectionKey,
E extends keyof DataEntryMap[C] = string,
> = {
collection: C;
id: E;
};
export type ReferenceLiveEntry<C extends keyof LiveContentConfig['collections']> = {
collection: C;
id: string;
};
export function getCollection<C extends keyof DataEntryMap, E extends CollectionEntry<C>>(
collection: C,
filter?: (entry: CollectionEntry<C>) => entry is E,
): Promise<E[]>;
export function getCollection<C extends keyof DataEntryMap>(
collection: C,
filter?: (entry: CollectionEntry<C>) => unknown,
): Promise<CollectionEntry<C>[]>;
export function getLiveCollection<C extends keyof LiveContentConfig['collections']>(
collection: C,
filter?: LiveLoaderCollectionFilterType<C>,
): Promise<
import('astro').LiveDataCollectionResult<LiveLoaderDataType<C>, LiveLoaderErrorType<C>>
>;
export function getEntry<
C extends keyof DataEntryMap,
E extends keyof DataEntryMap[C] | (string & {}),
>(
entry: ReferenceDataEntry<C, E>,
): E extends keyof DataEntryMap[C]
? Promise<DataEntryMap[C][E]>
: Promise<CollectionEntry<C> | undefined>;
export function getEntry<
C extends keyof DataEntryMap,
E extends keyof DataEntryMap[C] | (string & {}),
>(
collection: C,
id: E,
): E extends keyof DataEntryMap[C]
? string extends keyof DataEntryMap[C]
? Promise<DataEntryMap[C][E]> | undefined
: Promise<DataEntryMap[C][E]>
: Promise<CollectionEntry<C> | undefined>;
export function getLiveEntry<C extends keyof LiveContentConfig['collections']>(
collection: C,
filter: string | LiveLoaderEntryFilterType<C>,
): Promise<import('astro').LiveDataEntryResult<LiveLoaderDataType<C>, LiveLoaderErrorType<C>>>;
/** Resolve an array of entry references from the same collection */
export function getEntries<C extends keyof DataEntryMap>(
entries: ReferenceDataEntry<C, keyof DataEntryMap[C]>[],
): Promise<CollectionEntry<C>[]>;
export function render<C extends keyof DataEntryMap>(
entry: DataEntryMap[C][string],
): Promise<RenderResult>;
export function reference<
C extends
| keyof DataEntryMap
// Allow generic `string` to avoid excessive type errors in the config
// if `dev` is not running to update as you edit.
// Invalid collection names will be caught at build time.
| (string & {}),
>(
collection: C,
): import('astro/zod').ZodPipe<
import('astro/zod').ZodString,
import('astro/zod').ZodTransform<
C extends keyof DataEntryMap
? {
collection: C;
id: string;
}
: never,
string
>
>;
type ReturnTypeOrOriginal<T> = T extends (...args: any[]) => infer R ? R : T;
type InferEntrySchema<C extends keyof DataEntryMap> = import('astro/zod').infer<
ReturnTypeOrOriginal<Required<ContentConfig['collections'][C]>['schema']>
>;
type ExtractLoaderConfig<T> = T extends { loader: infer L } ? L : never;
type InferLoaderSchema<
C extends keyof DataEntryMap,
L = ExtractLoaderConfig<ContentConfig['collections'][C]>,
> = L extends { schema: import('astro/zod').ZodSchema }
? import('astro/zod').infer<L['schema']>
: any;
type DataEntryMap = {
};
type ExtractLoaderTypes<T> = T extends import('astro/loaders').LiveLoader<
infer TData,
infer TEntryFilter,
infer TCollectionFilter,
infer TError
>
? { data: TData; entryFilter: TEntryFilter; collectionFilter: TCollectionFilter; error: TError }
: { data: never; entryFilter: never; collectionFilter: never; error: never };
type ExtractEntryFilterType<T> = ExtractLoaderTypes<T>['entryFilter'];
type ExtractCollectionFilterType<T> = ExtractLoaderTypes<T>['collectionFilter'];
type ExtractErrorType<T> = ExtractLoaderTypes<T>['error'];
type LiveLoaderDataType<C extends keyof LiveContentConfig['collections']> =
LiveContentConfig['collections'][C]['schema'] extends undefined
? ExtractDataType<LiveContentConfig['collections'][C]['loader']>
: import('astro/zod').infer<
Exclude<LiveContentConfig['collections'][C]['schema'], undefined>
>;
type LiveLoaderEntryFilterType<C extends keyof LiveContentConfig['collections']> =
ExtractEntryFilterType<LiveContentConfig['collections'][C]['loader']>;
type LiveLoaderCollectionFilterType<C extends keyof LiveContentConfig['collections']> =
ExtractCollectionFilterType<LiveContentConfig['collections'][C]['loader']>;
type LiveLoaderErrorType<C extends keyof LiveContentConfig['collections']> = ExtractErrorType<
LiveContentConfig['collections'][C]['loader']
>;
export type ContentConfig = never;
export type LiveContentConfig = never;
}

.astro/types.d.ts vendored Normal file

@@ -0,0 +1,2 @@
/// <reference types="astro/client" />
/// <reference path="content.d.ts" />


@@ -6,6 +6,20 @@ FEYNMAN_THINKING=medium
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
GEMINI_API_KEY=
OPENROUTER_API_KEY=
ZAI_API_KEY=
KIMI_API_KEY=
MINIMAX_API_KEY=
MINIMAX_CN_API_KEY=
MISTRAL_API_KEY=
GROQ_API_KEY=
XAI_API_KEY=
CEREBRAS_API_KEY=
HF_TOKEN=
OPENCODE_API_KEY=
AI_GATEWAY_API_KEY=
AZURE_OPENAI_API_KEY=
RUNPOD_API_KEY=
MODAL_TOKEN_ID=


@@ -9,7 +9,7 @@ Operating rules:
- State uncertainty explicitly.
- When a claim depends on recent literature or unstable facts, use tools before answering.
- When discussing papers, cite title, year, and identifier or URL when possible.
- Use the alpha-research skill for academic paper search, paper reading, paper Q&A, repository inspection, and persistent annotations.
- Use the `alpha` CLI for academic paper search, paper reading, paper Q&A, repository inspection, and persistent annotations.
- Use `web_search`, `fetch_content`, and `get_search_content` first for current topics: products, companies, markets, regulations, software releases, model availability, model pricing, benchmarks, docs, or anything phrased as latest/current/recent/today.
- For mixed topics, combine both: use web sources for current reality and paper sources for background literature.
- Never answer a latest/current question from arXiv or alpha-backed paper search alone.


@@ -1,5 +1,6 @@
{
"packages": [
"npm:@companion-ai/alpha-hub",
"npm:pi-subagents",
"npm:pi-btw",
"npm:pi-docparser",


@@ -52,6 +52,7 @@ jobs:
build-native-bundles:
needs: version-check
if: needs.version-check.outputs.should_publish == 'true'
strategy:
fail-fast: false
matrix:
@@ -97,47 +98,6 @@ jobs:
name: native-${{ matrix.id }}
path: dist/release/*
release-edge:
needs:
- version-check
- build-native-bundles
if: needs.build-native-bundles.result == 'success'
runs-on: blacksmith-4vcpu-ubuntu-2404
permissions:
contents: write
steps:
- uses: actions/download-artifact@v4
with:
path: release-assets
merge-multiple: true
- shell: bash
env:
GH_REPO: ${{ github.repository }}
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
VERSION: ${{ needs.version-check.outputs.version }}
run: |
NOTES="Rolling Feynman bundles from main for the curl/PowerShell installer."
if gh release view edge >/dev/null 2>&1; then
gh release view edge --json assets --jq '.assets[].name' | while IFS= read -r asset; do
[ -n "$asset" ] || continue
gh release delete-asset edge "$asset" --yes
done
gh release upload edge release-assets/*
gh release edit edge \
--title "edge" \
--notes "$NOTES" \
--prerelease \
--draft=false \
--target "$GITHUB_SHA"
else
gh release create edge release-assets/* \
--title "edge" \
--notes "$NOTES" \
--prerelease \
--latest=false \
--target "$GITHUB_SHA"
fi
release-github:
needs:
- version-check

.nvmrc Normal file

@@ -0,0 +1 @@
20.19.0


@@ -14,3 +14,66 @@ Use this file to track chronology, not release notes. Keep entries short, factua
- Failed / learned: ...
- Blockers: ...
- Next: ...
### 2026-03-25 00:00 local — scaling-laws
- Objective: Set up a deep research workflow for scaling laws.
- Changed: Created plan artifact at `outputs/.plans/scaling-laws.md`; defined 4 disjoint researcher dimensions and acceptance criteria.
- Verified: Read `CHANGELOG.md` and checked prior memory for related plan `scaling-laws-implications`.
- Failed / learned: No prior run-specific changelog entries existed beyond the template.
- Blockers: Waiting for user confirmation before launching researcher round 1.
- Next: On confirmation, spawn 4 parallel researcher subagents and begin evidence collection.
### 2026-03-25 00:30 local — scaling-laws (T4 inference/time-scale pass)
- Objective: Complete T4 on inference/test-time scaling and reasoning-time compute, scoped to 2023-2026.
- Changed: Wrote `notes/scaling-laws-research-inference.md`; updated `outputs/.plans/scaling-laws.md` to mark T4 done and log the inference-scaling verification pass.
- Verified: Cross-read 13 primary/official sources covering Tree-of-Thoughts, PRMs, repeated sampling, compute-optimal test-time scaling, provable laws, o1, DeepSeek-R1, s1, verifier failures, Anthropic extended thinking, and OpenAI reasoning API docs.
- Failed / learned: OpenAI blog fetch for `learning-to-reason-with-llms` returned malformed content, so the note leans on the o1 system card and API docs instead of that blog post.
- Blockers: T2 and T5 remain open before final synthesis; no single unified law for inference-time scaling emerged from public sources.
- Next: Complete T5 implications synthesis, then reconcile T3/T4 with foundational T2 before drafting the cited brief.
### 2026-03-25 11:20 local — scaling-laws (T6 draft synthesis)
- Objective: Synthesize the four research notes into a single user-facing draft brief for the scaling-laws workflow.
- Changed: Wrote `outputs/.drafts/scaling-laws-draft.md` with an executive summary, curated reading list, qualitative meta-analysis, core-paper comparison table, explicit training-vs-inference distinction, and numbered inline citations with direct-URL sources.
- Verified: Cross-checked the draft against `notes/scaling-laws-research-foundations.md`, `notes/scaling-laws-research-revisions.md`, `notes/scaling-laws-research-inference.md`, and `notes/scaling-laws-research-implications.md` to ensure the brief explicitly states the literature is too heterogeneous for a pooled effect-size estimate.
- Failed / learned: The requested temp-run `context.md` and `plan.md` were absent, so the synthesis used `outputs/.plans/scaling-laws.md` plus the four note files as the working context.
- Blockers: Citation/claim verification pass still pending; this draft should be treated as pre-verification.
- Next: Run verifier/reviewer passes, then promote the draft into the final cited brief and provenance sidecar.
### 2026-03-25 11:28 local — scaling-laws (final brief + pdf)
- Objective: Deliver a paper guide and qualitative meta-analysis on AI scaling laws.
- Changed: Finalized `outputs/scaling-laws.md` and sidecar `outputs/scaling-laws.provenance.md`; rendered preview PDF at `outputs/scaling-laws.pdf`; updated plan ledger and verification log in `outputs/.plans/scaling-laws.md`.
- Verified: Ran a reviewer pass recorded in `notes/scaling-laws-verification.md`; spot-checked key primary papers via alpha-backed reads for Kaplan 2020, Chinchilla 2022, and Snell 2024; confirmed PDF render output exists.
- Failed / learned: A pooled statistical meta-analysis would be misleading because the literature mixes heterogeneous outcomes, scaling axes, and evaluation regimes; final deliverable uses a qualitative meta-analysis instead.
- Blockers: None for this brief.
- Next: If needed, extend into a narrower sub-survey (e.g. only pretraining laws, only inference-time scaling, or only post-Chinchilla data-quality revisions).
### 2026-03-25 14:52 local — skills-only-install
- Objective: Let users download the Feynman research skills without installing the full terminal runtime.
- Changed: Added standalone skills-only installers at `scripts/install/install-skills.sh` and `scripts/install/install-skills.ps1`; synced website-public copies; documented user-level and repo-local install flows in `README.md`, `website/src/content/docs/getting-started/installation.md`, and `website/src/pages/index.astro`.
- Verified: Ran `sh -n scripts/install/install-skills.sh`; ran `node scripts/sync-website-installers.mjs`; ran `cd website && npm run build`; executed `sh scripts/install/install-skills.sh --dir <tmp>` and confirmed extracted `SKILL.md` files land in the target directory.
- Failed / learned: PowerShell installer behavior was not executed locally because PowerShell is not installed in this environment.
- Blockers: None for the Unix installer flow; Windows remains syntax-only by inspection.
- Next: If users want this exposed more prominently, add a dedicated docs/reference page and a homepage-specific skills-only CTA instead of a text link.
### 2026-03-26 18:08 PDT — installer-release-unification
- Objective: Remove the moving `edge` installer channel and unify installs on tagged releases only.
- Changed: Updated `scripts/install/install.sh`, `scripts/install/install.ps1`, `scripts/install/install-skills.sh`, and `scripts/install/install-skills.ps1` so the default target is the latest tagged release, latest-version resolution uses public GitHub release pages instead of `api.github.com`, and explicit `edge` requests now fail with a removal message; removed the `release-edge` job from `.github/workflows/publish.yml`; updated `README.md` and `website/src/content/docs/getting-started/installation.md`; re-synced `website/public/install*`.
- Verified: Ran `sh -n` on the Unix installer copies; confirmed `sh scripts/install/install.sh edge` and `sh scripts/install/install-skills.sh edge --dir <tmp>` fail with the intended removal message; executed `sh scripts/install/install.sh` into temp dirs and confirmed the installed binary reports `0.2.14`; executed `sh scripts/install/install-skills.sh --dir <tmp>` and confirmed extracted `SKILL.md` files; ran `cd website && npm run build`.
- Failed / learned: The install failure was caused by unauthenticated GitHub API rate limiting on the `edge` path, so renaming channels without removing the API dependency would not have fixed the root cause.
- Blockers: `npm run build` still emits a pre-existing duplicate-content warning for `getting-started/installation`; the build succeeds.
- Next: If desired, remove the now-unused `stable` alias too and clean up the duplicate docs-content warning separately.
### 2026-03-27 11:58 PDT — release-0.2.15
- Objective: Make the non-Anthropic subagent/auth fixes and contributor-guide updates releasable to tagged-install users instead of leaving them only on `main`.
- Changed: Bumped the package version from `0.2.14` to `0.2.15` in `package.json` and `package-lock.json`; updated pinned installer examples in `README.md` and `website/src/content/docs/getting-started/installation.md`; aligned the local-development docs example to the npm-based root workflow; added `CONTRIBUTING.md` plus the bundled `skills/contributing/SKILL.md`.
- Verified: Confirmed the publish workflow keys off `package.json` versus the currently published npm version; confirmed local `npm test`, `npm run typecheck`, and `npm run build` pass before the release bump.
- Failed / learned: The open subagent issue is fixed on `main` but still user-visible on tagged installs until a fresh release is cut.
- Blockers: Need the GitHub publish workflow to finish successfully before the issue can be honestly closed as released.
- Next: Push `0.2.15`, monitor the publish workflow, then update and close the relevant GitHub issue/PR once the release is live.

CONTRIBUTING.md Normal file

@@ -0,0 +1,114 @@
# Contributing to Feynman
Feynman is a research-first CLI built on Pi and alphaXiv. This guide is for humans and agents contributing code, prompts, skills, docs, installers, or workflow behavior to the repository.
## Quick Links
- GitHub: https://github.com/getcompanion-ai/feynman
- Docs: https://feynman.is/docs
- Repo agent contract: [AGENTS.md](AGENTS.md)
- Issues: https://github.com/getcompanion-ai/feynman/issues
## What Goes Where
- CLI/runtime code: `src/`
- Bundled prompt templates: `prompts/`
- Bundled Pi skills: `skills/`
- Bundled Pi subagent prompts: `.feynman/agents/`
- Docs site: `website/`
- Build/release scripts: `scripts/`
- Generated research artifacts: `outputs/`, `papers/`, `notes/`
If you need to change how bundled subagents behave, edit `.feynman/agents/*.md`. Do not duplicate that behavior in `AGENTS.md`.
## Before You Open a PR
1. Start from the latest `main`.
2. Use Node.js `20.19.0` or newer. The repo expects `.nvmrc`, `package.json` engines, `website/package.json` engines, and the runtime version guard to stay aligned.
3. Install dependencies from the repo root:
```bash
nvm use || nvm install
npm install
```
4. Run the required checks before asking for review:
```bash
npm test
npm run typecheck
npm run build
```
5. If you changed the docs site, also validate the website:
```bash
cd website
npm install
npm run build
```
6. Keep the PR focused. Do not mix unrelated cleanup with the real change.
7. Add or update tests when behavior changes.
8. Update docs, prompts, or skills when the user-facing workflow changes.
## Contribution Rules
- Bugs, docs fixes, installer fixes, and focused workflow improvements are good PRs.
- Large feature changes should start with an issue or a concrete implementation discussion before code lands.
- Avoid refactor-only PRs unless they are necessary to unblock a real fix or requested by a maintainer.
- Do not silently change release behavior, installer behavior, or runtime defaults without documenting the reason in the PR.
- Use American English in docs, comments, prompts, UI copy, and examples.
## Repo-Specific Checks
### Prompt and skill changes
- New workflows usually live in `prompts/*.md`.
- New reusable capabilities usually live in `skills/<name>/SKILL.md`.
- Keep skill files concise. Put detailed operational rules in the prompt or in focused reference files only when needed.
- If a new workflow should be invokable from the CLI, make sure its prompt frontmatter includes the correct metadata and that the command works through the normal prompt discovery path.
### Agent and artifact conventions
- `AGENTS.md` is the repo-level contract for workspace conventions, handoffs, provenance, and output naming.
- Long-running research flows should write plan artifacts to `outputs/.plans/` and use `CHANGELOG.md` as a lab notebook when the work is substantial.
- Do not update `CHANGELOG.md` for trivial one-shot changes.
### Release and versioning discipline
- The curl installer and release docs point users at tagged releases, not arbitrary commits on `main`.
- If you ship user-visible fixes after a tag, do not leave the repo in a state where `main` and the latest release advertise the same version string while containing different behavior.
- When changing release-sensitive behavior, check the version story across:
- `.nvmrc`
- `package.json`
- `website/package.json`
- `scripts/check-node-version.mjs`
- install docs in `README.md` and `website/src/content/docs/getting-started/installation.md`
## AI-Assisted Contributions
AI-assisted PRs are fine. The contributor is still responsible for the diff.
- Understand the code you are submitting.
- Run the local checks yourself instead of assuming generated code is correct.
- Include enough context in the PR description for a reviewer to understand the change quickly.
- If an agent updated prompts or skills, verify the instructions match the actual repo behavior.
## Review Expectations
- Explain what changed and why.
- Call out tradeoffs, follow-up work, and anything intentionally not handled.
- Include screenshots for UI changes.
- Resolve review comments you addressed before requesting review again.
## Good First Areas
Useful contributions usually land in one of these areas:
- installation and upgrade reliability
- research workflow quality
- model/provider setup ergonomics
- docs clarity
- preview and export stability
- packaging and release hygiene


@@ -13,19 +13,55 @@
### Installation
**macOS / Linux:**
```bash
curl -fsSL https://feynman.is/install | bash
# stable release channel
curl -fsSL https://feynman.is/install | bash -s -- stable
# package manager fallback
pnpm add -g @companion-ai/feynman
bun add -g @companion-ai/feynman
```
The one-line installer tracks the latest `main` build. Use `stable` or an exact version to pin a release. Then run `feynman setup` to configure your model and get started.
**Windows (PowerShell):**
```powershell
irm https://feynman.is/install.ps1 | iex
```
The one-line installer fetches the latest tagged release. To pin a version, pass it explicitly, for example `curl -fsSL https://feynman.is/install | bash -s -- 0.2.15`.
If you install via `pnpm` or `bun` instead of the standalone bundle, Feynman requires Node.js `20.19.0` or newer.
### Skills Only
If you want just the research skills without the full terminal app:
**macOS / Linux:**
```bash
curl -fsSL https://feynman.is/install-skills | bash
```
**Windows (PowerShell):**
```powershell
irm https://feynman.is/install-skills.ps1 | iex
```
That installs the skill library into `~/.codex/skills/feynman`.
For a repo-local install instead:
**macOS / Linux:**
```bash
curl -fsSL https://feynman.is/install-skills | bash -s -- --repo
```
**Windows (PowerShell):**
```powershell
& ([scriptblock]::Create((irm https://feynman.is/install-skills.ps1))) -Scope Repo
```
That installs into `.agents/skills/feynman` under the current repository.
---
@@ -45,7 +81,10 @@ $ feynman audit 2401.12345
→ Compares paper claims against the public codebase
$ feynman replicate "chain-of-thought improves math"
Asks where to run, then builds a replication plan
Replicates experiments on local or cloud GPUs
$ feynman valichord "study-id-or-topic"
→ Runs the ValiChord reproducibility workflow or checks existing Harmony Records
```
---
@@ -60,7 +99,8 @@ Ask naturally or use slash commands as shortcuts.
| `/lit <topic>` | Literature review from paper search and primary sources |
| `/review <artifact>` | Simulated peer review with severity and revision plan |
| `/audit <item>` | Paper vs. codebase mismatch audit |
| `/replicate <paper>` | Replication plan with environment selection |
| `/replicate <paper>` | Replicate experiments on local or cloud GPUs |
| `/valichord <study-or-topic>` | Reproducibility attestation workflow and Harmony Record lookup |
| `/compare <topic>` | Source comparison matrix |
| `/draft <topic>` | Paper-style draft from research findings |
| `/autoresearch <idea>` | Autonomous experiment loop |
@@ -100,9 +140,16 @@ Built on [Pi](https://github.com/badlogic/pi-mono) for the agent runtime, [alpha
### Contributing
See [CONTRIBUTING.md](CONTRIBUTING.md) for the full contributor guide.
```bash
git clone https://github.com/getcompanion-ai/feynman.git
cd feynman && pnpm install && pnpm start
cd feynman
nvm use || nvm install
npm install
npm test
npm run typecheck
npm run build
```
[Docs](https://feynman.is/docs) · [MIT License](LICENSE)


@@ -1,9 +1,31 @@
#!/usr/bin/env node
const v = process.versions.node.split(".").map(Number);
if (v[0] < 20) {
console.error(`feynman requires Node.js 20 or later (you have ${process.versions.node})`);
console.error("upgrade: https://nodejs.org or nvm install 20");
const MIN_NODE_VERSION = "20.19.0";
function parseNodeVersion(version) {
const [major = "0", minor = "0", patch = "0"] = version.replace(/^v/, "").split(".");
return {
major: Number.parseInt(major, 10) || 0,
minor: Number.parseInt(minor, 10) || 0,
patch: Number.parseInt(patch, 10) || 0,
};
}
function compareNodeVersions(left, right) {
if (left.major !== right.major) return left.major - right.major;
if (left.minor !== right.minor) return left.minor - right.minor;
return left.patch - right.patch;
}
if (compareNodeVersions(parseNodeVersion(process.versions.node), parseNodeVersion(MIN_NODE_VERSION)) < 0) {
const isWindows = process.platform === "win32";
console.error(`feynman requires Node.js ${MIN_NODE_VERSION} or later (detected ${process.versions.node}).`);
console.error(isWindows
? "Install a newer Node.js from https://nodejs.org, or use the standalone installer:"
: "Switch to Node 20 with `nvm install 20 && nvm use 20`, or use the standalone installer:");
console.error(isWindows
? "irm https://feynman.is/install.ps1 | iex"
: "curl -fsSL https://feynman.is/install | bash");
process.exit(1);
}
await import("../scripts/patch-embedded-pi.mjs");
await import("../dist/index.js");
await import(new URL("../scripts/patch-embedded-pi.mjs", import.meta.url).href);
await import(new URL("../dist/index.js", import.meta.url).href);


@@ -1,5 +1,6 @@
import type { ExtensionAPI } from "@mariozechner/pi-coding-agent";
import { registerAlphaTools } from "./research-tools/alpha.js";
import { installFeynmanHeader } from "./research-tools/header.js";
import { registerHelpCommand } from "./research-tools/help.js";
import { registerInitCommand, registerOutputsCommand } from "./research-tools/project.js";
@@ -15,6 +16,7 @@ export default function researchTools(pi: ExtensionAPI): void {
await installFeynmanHeader(pi, ctx, cache);
});
registerAlphaTools(pi);
registerHelpCommand(pi);
registerInitCommand(pi);
registerOutputsCommand(pi);


@@ -0,0 +1,107 @@
import {
askPaper,
annotatePaper,
clearPaperAnnotation,
getPaper,
listPaperAnnotations,
readPaperCode,
searchPapers,
} from "@companion-ai/alpha-hub/lib";
import type { ExtensionAPI } from "@mariozechner/pi-coding-agent";
import { Type } from "@sinclair/typebox";
function formatText(value: unknown): string {
if (typeof value === "string") return value;
return JSON.stringify(value, null, 2);
}
export function registerAlphaTools(pi: ExtensionAPI): void {
pi.registerTool({
name: "alpha_search",
label: "Alpha Search",
description:
"Search research papers through alphaXiv. Modes: semantic (default, use 2-3 sentence queries), keyword (exact terms), agentic (broad multi-turn retrieval), both, or all.",
parameters: Type.Object({
query: Type.String({ description: "Search query." }),
mode: Type.Optional(
Type.String({ description: "Search mode: semantic, keyword, both, agentic, or all." }),
),
}),
async execute(_toolCallId, params) {
const result = await searchPapers(params.query, params.mode?.trim() || "semantic");
return { content: [{ type: "text", text: formatText(result) }], details: result };
},
});
pi.registerTool({
name: "alpha_get_paper",
label: "Alpha Get Paper",
description: "Fetch a paper's AI-generated report (or raw full text) plus any local annotation.",
parameters: Type.Object({
paper: Type.String({ description: "arXiv ID, arXiv URL, or alphaXiv URL." }),
fullText: Type.Optional(Type.Boolean({ description: "Return raw full text instead of AI report." })),
}),
async execute(_toolCallId, params) {
const result = await getPaper(params.paper, { fullText: params.fullText });
return { content: [{ type: "text", text: formatText(result) }], details: result };
},
});
pi.registerTool({
name: "alpha_ask_paper",
label: "Alpha Ask Paper",
description: "Ask a targeted question about a paper. Uses AI to analyze the PDF and answer.",
parameters: Type.Object({
paper: Type.String({ description: "arXiv ID, arXiv URL, or alphaXiv URL." }),
question: Type.String({ description: "Question about the paper." }),
}),
async execute(_toolCallId, params) {
const result = await askPaper(params.paper, params.question);
return { content: [{ type: "text", text: formatText(result) }], details: result };
},
});
pi.registerTool({
name: "alpha_annotate_paper",
label: "Alpha Annotate Paper",
description: "Write or clear a persistent local annotation for a paper.",
parameters: Type.Object({
paper: Type.String({ description: "Paper ID (arXiv ID or URL)." }),
note: Type.Optional(Type.String({ description: "Annotation text. Omit when clear=true." })),
clear: Type.Optional(Type.Boolean({ description: "Clear the existing annotation." })),
}),
async execute(_toolCallId, params) {
const result = params.clear
? await clearPaperAnnotation(params.paper)
: params.note
? await annotatePaper(params.paper, params.note)
: (() => { throw new Error("Provide either note or clear=true."); })();
return { content: [{ type: "text", text: formatText(result) }], details: result };
},
});
pi.registerTool({
name: "alpha_list_annotations",
label: "Alpha List Annotations",
description: "List all persistent local paper annotations.",
parameters: Type.Object({}),
async execute() {
const result = await listPaperAnnotations();
return { content: [{ type: "text", text: formatText(result) }], details: result };
},
});
pi.registerTool({
name: "alpha_read_code",
label: "Alpha Read Code",
description: "Read files from a paper's GitHub repository. Use '/' for repo overview.",
parameters: Type.Object({
githubUrl: Type.String({ description: "GitHub repository URL." }),
path: Type.Optional(Type.String({ description: "File or directory path. Default: '/'" })),
}),
async execute(_toolCallId, params) {
const result = await readPaperCode(params.githubUrl, params.path?.trim() || "/");
return { content: [{ type: "text", text: formatText(result) }], details: result };
},
});
}

package-lock.json generated

@@ -1,12 +1,12 @@
{
"name": "@companion-ai/feynman",
"version": "0.2.14",
"version": "0.2.15",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "@companion-ai/feynman",
"version": "0.2.14",
"version": "0.2.15",
"license": "MIT",
"dependencies": {
"@companion-ai/alpha-hub": "^0.1.2",
@@ -24,7 +24,7 @@
"typescript": "^5.9.3"
},
"engines": {
"node": ">=20.18.1"
"node": ">=20.19.0"
}
},
"node_modules/@anthropic-ai/sdk": {


@@ -1,11 +1,11 @@
{
"name": "@companion-ai/feynman",
"version": "0.2.14",
"version": "0.2.15",
"description": "Research-first CLI agent built on Pi and alphaXiv",
"license": "MIT",
"type": "module",
"engines": {
"node": ">=20.18.1"
"node": ">=20.19.0"
},
"bin": {
"feynman": "bin/feynman.js"
@@ -26,14 +26,16 @@
"scripts/",
"skills/",
"AGENTS.md",
"CONTRIBUTING.md",
"README.md",
".env.example"
],
"scripts": {
"preinstall": "node ./scripts/check-node-version.mjs",
"build": "tsc -p tsconfig.build.json",
"build:native-bundle": "node ./scripts/build-native-bundle.mjs",
"dev": "tsx src/index.ts",
"prepack": "npm run build && node ./scripts/prepare-runtime-workspace.mjs",
"prepack": "node ./scripts/clean-publish-artifacts.mjs && npm run build && node ./scripts/prepare-runtime-workspace.mjs",
"start": "tsx src/index.ts",
"start:dist": "node ./bin/feynman.js",
"test": "node --import tsx --test --test-concurrency=1 tests/*.test.ts",

prompts/valichord.md Normal file

@@ -0,0 +1,266 @@
---
description: Submit a replication as a cryptographically verified ValiChord attestation, discover studies awaiting independent validation, query Harmony Records and reproducibility badges, or assist researchers in preparing a study for the validation pipeline.
section: Research Workflows
topLevelCli: true
---
# ValiChord Validation Workflow
ValiChord is a distributed peer-to-peer system for scientific reproducibility verification, built on Holochain. It implements a blind commit-reveal protocol in Rust across four DNAs, producing Harmony Records — immutable, cryptographically verifiable proofs that independent parties reproduced the same findings without coordinating. Verified studies receive automatic reproducibility badges (Gold/Silver/Bronze); validators accumulate a per-discipline reputation score across rounds.
This workflow integrates Feynman at three levels: as a **validator agent** running the full commit-reveal protocol; as a **researcher's assistant** helping prepare a study for submission; and as a **query tool** surfacing reproducibility status during research.
**Live demo of the commit-reveal protocol**: https://youtu.be/DQ5wZSD1YEw
---
## ValiChord's four-DNA architecture
| DNA | Name | Type | Role |
|-----|------|------|------|
| 1 | Researcher Repository | Private, single-agent | Researcher's local archive. Stores study, pre-registered protocol, data snapshots, deviation declarations. Only SHA-256 hashes ever leave this DNA. |
| 2 | Validator Workspace | Private, single-agent | Feynman's working space. Stores task privately. Seals the blind commitment here — content never propagates to the DHT. |
| 3 | Attestation | Shared DHT | Coordination layer. Manages validation requests, validator profiles, study claims, commitment anchors, phase markers, and public attestations. 36 zome functions. |
| 4 | Governance | Public DHT | Final record layer. Assembles HarmonyRecords, issues reproducibility badges, tracks validator reputation, records governance decisions. All read functions accessible via HTTP Gateway without running a node. |
The key guarantee: a validator's findings are cryptographically sealed (`SHA-256(msgpack(attestation) || nonce)`) before the reveal phase opens, so neither party can adjust findings after seeing the other's results. The researcher runs a parallel commit-reveal — locking their expected results before the validators reveal — so no side can adapt to the other's outcome.
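The sealing arithmetic can be sketched in a few lines. This is an illustrative Python sketch, not ValiChord's Rust implementation: canonical JSON stands in for msgpack serialization, and `seal_commitment` / `verify_reveal` are hypothetical helper names.

```python
import hashlib
import json
import secrets

def seal_commitment(attestation: dict) -> tuple[bytes, bytes]:
    # Canonically serialize the attestation (ValiChord uses msgpack;
    # JSON with sorted keys stands in here for illustration).
    payload = json.dumps(attestation, sort_keys=True).encode()
    nonce = secrets.token_bytes(32)
    commitment = hashlib.sha256(payload + nonce).digest()
    return commitment, nonce

def verify_reveal(attestation: dict, nonce: bytes, commitment: bytes) -> bool:
    # The reveal is accepted only if it re-hashes to the sealed commitment.
    payload = json.dumps(attestation, sort_keys=True).encode()
    return hashlib.sha256(payload + nonce).digest() == commitment

attestation = {"outcome": "Reproduced", "confidence": "High"}
commitment, nonce = seal_commitment(attestation)
assert verify_reveal(attestation, nonce, commitment)
# Any post-hoc change to the attestation invalidates the commitment.
assert not verify_reveal({"outcome": "FailedToReproduce"}, nonce, commitment)
```

Because the nonce is random and kept private until the reveal, the commitment leaks nothing about the attestation's content.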
---
## Workflow A: Feynman as validator agent
### Step 0: Publish validator profile (one-time setup)
On first use, publish Feynman's public profile to DNA 3 so it appears in validator discovery indexes and conflict-of-interest checks:
```
publish_validator_profile(profile: ValidatorProfile)
```
Key fields:
- `agent_type` — `AutomatedTool` (AI agents are first-class validators; the protocol makes no distinction between human and machine validators)
- `disciplines` — list of disciplines Feynman can validate (e.g. ComputationalBiology, Statistics)
- `certification_tier` — starts as `Provisional`; advances to `Certified` after 5+ validations with ≥60% agreement rate, `Senior` after 20+ with ≥80%
If a profile already exists, use `update_validator_profile` to merge changes.
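Under the documented thresholds, tier advancement reduces to two comparisons. A hedged sketch — `certification_tier` is a hypothetical helper, not a zome function:

```python
def certification_tier(total_validations: int, agreement_rate: float) -> str:
    # Thresholds from the profile description above; names are illustrative.
    if total_validations >= 20 and agreement_rate >= 0.80:
        return "Senior"
    if total_validations >= 5 and agreement_rate >= 0.60:
        return "Certified"
    return "Provisional"

assert certification_tier(4, 1.0) == "Provisional"
assert certification_tier(5, 0.60) == "Certified"
assert certification_tier(20, 0.80) == "Senior"
```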
### Step 1: Gather inputs or discover study
**If the user provides a `request_ref`**: use it directly.
**If Feynman is proactively discovering work**: query the pending queue in DNA 3:
```
get_pending_requests_for_discipline(discipline: Discipline)
```
Returns all unclaimed `ValidationRequest` entries for the discipline. Each contains:
- `data_hash` — the ExternalHash identifier (used as `request_ref` throughout)
- `num_validators_required` — quorum needed to close the round
- `validation_tier` — Basic / Enhanced / Comprehensive
- `access_urls` — where to fetch the data and code
Optionally assess study complexity before committing:
```
assess_difficulty(input: AssessDifficultyInput)
```
Scores code volume, dependency count, documentation quality, data accessibility, and environment complexity. Returns predicted duration and confidence. Use this to decide whether to proceed before claiming.
If replication results are not yet available, suggest `/replicate` first.
### Step 2: Claim the study
Before receiving a formal task assignment, register intent to validate via DNA 3:
```
claim_study(request_ref: ExternalHash)
```
This:
- Reserves a validator slot (enforced capacity: no over-subscription)
- Triggers conflict-of-interest check — rejects claim if Feynman's institution matches the researcher's
- Records a `StudyClaim` entry on the shared DHT
If a claimed validator goes dark, any other validator can free the slot:
```
reclaim_abandoned_claim(input: ReclaimInput)
```
### Step 3: Receive task and seal private attestation — Commit phase
Connect to the ValiChord conductor via AppWebSocket. Using DNA 2 (Validator Workspace):
```
receive_task(request_ref, discipline, deadline_secs, validation_focus, time_cap_secs, compensation_tier)
```
`validation_focus` specifies which aspect Feynman is validating:
- `ComputationalReproducibility` — re-run code, check numerical outputs
- `PreCommitmentAdherence` — verify results match pre-registered analysis plan
- `MethodologicalReview` — assess statistical choices and protocol validity
Then seal the private attestation — this is the blind commitment:
```
seal_private_attestation(task_hash, attestation)
```
Where `attestation` includes:
- `outcome` — `Reproduced` / `PartiallyReproduced` / `FailedToReproduce` / `UnableToAssess`
- `outcome_summary` — key metrics, effect direction, confidence interval overlap, overall agreement
- `confidence` — High / Medium / Low
- `time_invested_secs` and `time_breakdown` — environment_setup, data_acquisition, code_execution, troubleshooting
- `computational_resources` — whether personal hardware, HPC, GPU, or cloud was required; estimated cost in pence
- `deviation_flags` — any undeclared departures from the original protocol (type, severity, evidence)
The coordinator computes `commitment_hash = SHA-256(msgpack(attestation) || nonce)` and writes a `CommitmentAnchor` to DNA 3's shared DHT. The attestation content remains private in DNA 2.
Save `task_hash` and `commitment_hash` to `outputs/<slug>-valichord-commit.json`.
### Step 4: Wait for RevealOpen phase
Poll DNA 3 (Attestation) until the phase transitions:
```
get_current_phase(request_ref: ExternalHash)
```
Returns `null` (still commit phase), `"RevealOpen"`, or `"Complete"`. Poll every 30 seconds. The phase opens automatically when the `CommitmentAnchor` count reaches `num_validators_required` — no manual trigger required.
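A minimal polling loop for this step might look like the following, where `get_current_phase` stands in for the zome call of the same name; the timeout ceiling is an assumption, not part of the protocol:

```python
import time

def wait_for_reveal_open(get_current_phase, request_ref,
                         poll_secs=30, timeout_secs=7 * 24 * 3600):
    # Poll DNA 3 until the round leaves the commit phase.
    deadline = time.monotonic() + timeout_secs
    while time.monotonic() < deadline:
        phase = get_current_phase(request_ref)
        if phase in ("RevealOpen", "Complete"):
            return phase
        time.sleep(poll_secs)
    raise TimeoutError("round never left the commit phase")
```

In practice the callback would wrap an AppWebSocket zome call against the Attestation DNA.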
During this wait, the researcher also runs their parallel commit-reveal: they lock their expected results via `publish_researcher_commitment` before the reveal phase opens, then reveal via `reveal_researcher_result` after all validators have submitted. No party — researcher or validator — can adapt to seeing the other's outcome.
### Step 5: Submit attestation — Reveal phase
When phase is `RevealOpen`, publish the full attestation to the shared DHT via DNA 3:
```
submit_attestation(attestation, nonce)
```
The coordinator verifies `SHA-256(msgpack(attestation) || nonce) == CommitmentAnchor.commitment_hash` before writing. This prevents adaptive reveals — the attestation must match exactly what was committed.
### Step 6: Retrieve Harmony Record and badges
Call DNA 4 (Governance) explicitly after `submit_attestation` returns — DHT propagation means the ValidatorToAttestation link may not be visible within the same transaction:
```
check_and_create_harmony_record(request_ref)
get_harmony_record(request_ref)
get_badges_for_study(request_ref)
```
The **Harmony Record** contains:
- `outcome` — the majority reproduced/not-reproduced finding
- `agreement_level` — ExactMatch / WithinTolerance / DirectionalMatch / Divergent / UnableToAssess
- `participating_validators` — array of validator agent keys
- `validation_duration_secs`
- `ActionHash` — the immutable on-chain identifier
**Reproducibility badges** are automatically issued when the Harmony Record is created:
| Badge | Threshold |
|-------|-----------|
| GoldReproducible | ≥7 validators, ≥90% agreement |
| SilverReproducible | ≥5 validators, ≥70% agreement |
| BronzeReproducible | ≥3 validators, ≥50% agreement |
| FailedReproduction | Divergent outcomes |
Save the full record and badges to `outputs/<slug>-harmony-record.json`.
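The badge thresholds above can be expressed as one decision function. This is a sketch: `issue_badge` is hypothetical, and the fallback for sub-quorum rounds is an assumption, not documented behavior.

```python
def issue_badge(num_validators: int, agreement_rate: float, divergent: bool) -> str:
    # Tier thresholds taken from the badge table; check the highest
    # tier first, since a round can satisfy several rows at once.
    if divergent:
        return "FailedReproduction"
    if num_validators >= 7 and agreement_rate >= 0.90:
        return "GoldReproducible"
    if num_validators >= 5 and agreement_rate >= 0.70:
        return "SilverReproducible"
    if num_validators >= 3 and agreement_rate >= 0.50:
        return "BronzeReproducible"
    return "NoBadge"  # sub-quorum fallback (assumed, not documented)

assert issue_badge(7, 0.92, False) == "GoldReproducible"
assert issue_badge(5, 0.75, False) == "SilverReproducible"
assert issue_badge(9, 0.95, True) == "FailedReproduction"
```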
### Step 7: Check updated reputation
After each validation round, Feynman's reputation record in DNA 4 is updated:
```
get_validator_reputation(validator: AgentPubKey)
```
Returns per-discipline scores: total validations, agreement rate, average time, and current `CertificationTier` (Provisional → Certified → Senior). Reputation is a long-term asset — AI validators accumulate a cryptographically verifiable track record across all ValiChord rounds they participate in.
### Step 8: Report to user
Present:
- Outcome and agreement level
- Reproducibility badge(s) issued to the study
- Feynman's updated reputation score for this discipline
- ActionHash — the permanent public identifier for this Harmony Record
- Confirmation that the record is written to the Governance DHT and accessible via HTTP Gateway without any special infrastructure
- Path to saved outputs
---
## Workflow B: Query existing Harmony Record
`get_harmony_record` and `get_badges_for_study` in DNA 4 are `Unrestricted` functions — accessible via Holochain's HTTP Gateway without connecting to a conductor or running a node.
```
GET <http_gateway_url>/get_harmony_record/<request_ref_b64>
GET <http_gateway_url>/get_badges_for_study/<request_ref_b64>
```
Use this to:
- Check reproducibility status of a cited study during `/deepresearch`
- Surface Harmony Records and badges in research summaries
- Verify whether a study has undergone independent validation before recommending it
The following read functions are also unrestricted on DNA 3:
`get_attestations_for_request`, `get_validators_for_discipline`, `get_pending_requests_for_discipline`, `get_validator_profile`, `get_current_phase`, `get_difficulty_assessment`, `get_researcher_reveal`
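Building a gateway request reduces to encoding the `request_ref` into the URL path. The encoding details (base64url with stripped padding for `<request_ref_b64>`) are assumptions, and `gateway_url` is a hypothetical helper:

```python
import base64

def gateway_url(gateway: str, fn: str, request_ref: bytes) -> str:
    # base64url without padding is assumed for the path segment.
    ref_b64 = base64.urlsafe_b64encode(request_ref).rstrip(b"=").decode()
    return f"{gateway.rstrip('/')}/{fn}/{ref_b64}"

print(gateway_url("https://gateway.example", "get_harmony_record", b"\x01\x02"))
```

The returned URL can then be fetched with any HTTP client, with no Holochain node required.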
---
## Workflow C: Proactive discipline queue monitoring
Feynman can act as a standing validator for a discipline — periodically checking for new studies that need validation without waiting to be assigned:
```
get_pending_requests_for_discipline(discipline: Discipline)
```
Returns all unclaimed `ValidationRequest` entries. For each, optionally run `assess_difficulty` to estimate workload before claiming.
This enables Feynman to operate as an autonomous reproducibility agent: polling the queue, assessing difficulty, claiming appropriate studies, and running the full Workflow A cycle unsupervised.
---
## Workflow D: Researcher preparation assistant
Before a study enters the validation pipeline, Feynman can assist the researcher in preparing it via DNA 1 (Researcher Repository). This workflow runs on the researcher's side, not the validator's.
**Register the study:**
```
register_study(study: ResearchStudy)
```
**Pre-register the analysis protocol** (immutable once written — creates a tamper-evident commitment to the analysis plan before data collection or validation begins):
```
register_protocol(input: RegisterProtocolInput)
```
**Take a cryptographic data snapshot** (records a SHA-256 hash of the dataset at a point in time — proves data was not modified after validation began):
```
take_data_snapshot(input: TakeDataSnapshotInput)
```
**Declare any deviations** from the pre-registered plan before the commit phase opens (pre-commit transparency):
```
declare_deviation(input: DeclareDeviationInput)
```
Only hashes ever leave DNA 1 — the raw data and protocol text remain on the researcher's device.
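A snapshot of this kind can be computed locally with nothing but the standard library. The streaming helper below is illustrative, not ValiChord's implementation:

```python
import hashlib

def snapshot_hash(path: str, chunk_size: int = 1 << 20) -> str:
    # Stream the dataset so large files never sit fully in memory;
    # only this hex digest leaves the researcher's device, never the data.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Re-running the same function over an unmodified dataset reproduces the identical digest, which is what makes the snapshot tamper-evident.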
**Repository Readiness Checker**: ValiChord also ships a standalone audit tool that scans a research repository for 30+ reproducibility failure modes before submission — missing dependency files, absolute paths, undeclared environment requirements, data documentation gaps, human-subjects data exposure risks, and more. Feynman is the natural interface for this tool: running the audit, interpreting findings in plain language, guiding the researcher through fixes, and confirming the repository meets the bar for independent validation. See: https://github.com/topeuph-ai/ValiChord
---
## Notes
- AI agents are first-class participants in ValiChord's protocol. Feynman can autonomously publish profiles, claim studies, seal attestations, wait for phase transitions, and submit reveals — the protocol makes no distinction between human and AI validators.
- ValiChord's privacy guarantee is structural, not policy-based. DNA 1 (researcher data) and DNA 2 (validator workspace) are single-agent private DHTs — propagation to the shared network is architecturally impossible, not merely restricted.
- All 72 zome functions across the four DNAs are callable via AppWebSocket. The 20+ `Unrestricted` read functions on DNA 3 and DNA 4 are additionally accessible via HTTP Gateway without any Holochain node.
- If a validation round stalls due to validator dropout, `force_finalize_round` in DNA 4 closes it after a 7-day timeout with a reduced quorum, preventing indefinite blocking.
- Live demo (full commit-reveal cycle, Harmony Record generated): https://youtu.be/DQ5wZSD1YEw
- Running the demo: `bash demo/start.sh` in a GitHub Codespace, then open port 8888 publicly
- ValiChord repo: https://github.com/topeuph-ai/ValiChord


@@ -6,13 +6,45 @@ import { spawnSync } from "node:child_process";
const appRoot = resolve(import.meta.dirname, "..");
const packageJson = JSON.parse(readFileSync(resolve(appRoot, "package.json"), "utf8"));
const packageLockPath = resolve(appRoot, "package-lock.json");
const bundledNodeVersion = process.env.FEYNMAN_BUNDLED_NODE_VERSION ?? process.version.slice(1);
const minBundledNodeVersion = packageJson.engines?.node?.replace(/^>=/, "").trim() || process.version.slice(1);
function parseSemver(version) {
const [major = "0", minor = "0", patch = "0"] = version.split(".");
return [Number.parseInt(major, 10) || 0, Number.parseInt(minor, 10) || 0, Number.parseInt(patch, 10) || 0];
}
function compareSemver(left, right) {
for (let index = 0; index < 3; index += 1) {
const diff = left[index] - right[index];
if (diff !== 0) return diff;
}
return 0;
}
function fail(message) {
console.error(`[feynman] ${message}`);
process.exit(1);
}
function resolveBundledNodeVersion() {
const requestedNodeVersion = process.env.FEYNMAN_BUNDLED_NODE_VERSION?.trim();
if (requestedNodeVersion) {
if (compareSemver(parseSemver(requestedNodeVersion), parseSemver(minBundledNodeVersion)) < 0) {
fail(
`FEYNMAN_BUNDLED_NODE_VERSION=${requestedNodeVersion} is below the supported floor ${minBundledNodeVersion}`,
);
}
return requestedNodeVersion;
}
const currentNodeVersion = process.version.slice(1);
return compareSemver(parseSemver(currentNodeVersion), parseSemver(minBundledNodeVersion)) < 0
? minBundledNodeVersion
: currentNodeVersion;
}
const bundledNodeVersion = resolveBundledNodeVersion();
function resolveCommand(command) {
if (process.platform === "win32" && command === "npm") {
return "npm.cmd";


@@ -0,0 +1,40 @@
const MIN_NODE_VERSION = "20.19.0";
function parseNodeVersion(version) {
const [major = "0", minor = "0", patch = "0"] = version.replace(/^v/, "").split(".");
return {
major: Number.parseInt(major, 10) || 0,
minor: Number.parseInt(minor, 10) || 0,
patch: Number.parseInt(patch, 10) || 0,
};
}
function compareNodeVersions(left, right) {
if (left.major !== right.major) return left.major - right.major;
if (left.minor !== right.minor) return left.minor - right.minor;
return left.patch - right.patch;
}
function isSupportedNodeVersion(version = process.versions.node) {
return compareNodeVersions(parseNodeVersion(version), parseNodeVersion(MIN_NODE_VERSION)) >= 0;
}
function getUnsupportedNodeVersionLines(version = process.versions.node) {
const isWindows = process.platform === "win32";
return [
`feynman requires Node.js ${MIN_NODE_VERSION} or later (detected ${version}).`,
isWindows
? "Install a newer Node.js from https://nodejs.org, or use the standalone installer:"
: "Switch to Node 20 with `nvm install 20 && nvm use 20`, or use the standalone installer:",
isWindows
? "irm https://feynman.is/install.ps1 | iex"
: "curl -fsSL https://feynman.is/install | bash",
];
}
if (!isSupportedNodeVersion()) {
for (const line of getUnsupportedNodeVersionLines()) {
console.error(line);
}
process.exit(1);
}


@@ -0,0 +1,8 @@
import { rmSync } from "node:fs";
import { resolve } from "node:path";
const appRoot = resolve(import.meta.dirname, "..");
const releaseDir = resolve(appRoot, "dist", "release");
rmSync(releaseDir, { recursive: true, force: true });
console.log("[feynman] removed dist/release before npm pack/publish");


@@ -0,0 +1,123 @@
param(
[string]$Version = "latest",
[ValidateSet("User", "Repo")]
[string]$Scope = "User",
[string]$TargetDir = ""
)
$ErrorActionPreference = "Stop"
function Normalize-Version {
param([string]$RequestedVersion)
if (-not $RequestedVersion) {
return "latest"
}
switch ($RequestedVersion.ToLowerInvariant()) {
"latest" { return "latest" }
"stable" { return "latest" }
"edge" { throw "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." }
default { return $RequestedVersion.TrimStart("v") }
}
}
function Resolve-LatestReleaseVersion {
$page = Invoke-WebRequest -Uri "https://github.com/getcompanion-ai/feynman/releases/latest"
$match = [regex]::Match($page.Content, 'releases/tag/v([0-9][^"''<>\s]*)')
if (-not $match.Success) {
throw "Failed to resolve the latest Feynman release version."
}
return $match.Groups[1].Value
}
function Resolve-VersionMetadata {
param([string]$RequestedVersion)
$normalizedVersion = Normalize-Version -RequestedVersion $RequestedVersion
if ($normalizedVersion -eq "latest") {
$resolvedVersion = Resolve-LatestReleaseVersion
} else {
$resolvedVersion = $normalizedVersion
}
return [PSCustomObject]@{
ResolvedVersion = $resolvedVersion
GitRef = "v$resolvedVersion"
DownloadUrl = "https://github.com/getcompanion-ai/feynman/archive/refs/tags/v$resolvedVersion.zip"
}
}
function Resolve-InstallDir {
param(
[string]$ResolvedScope,
[string]$ResolvedTargetDir
)
if ($ResolvedTargetDir) {
return $ResolvedTargetDir
}
if ($ResolvedScope -eq "Repo") {
return Join-Path (Get-Location) ".agents\skills\feynman"
}
$codexHome = if ($env:CODEX_HOME) { $env:CODEX_HOME } else { Join-Path $HOME ".codex" }
return Join-Path $codexHome "skills\feynman"
}
$metadata = Resolve-VersionMetadata -RequestedVersion $Version
$resolvedVersion = $metadata.ResolvedVersion
$downloadUrl = $metadata.DownloadUrl
$installDir = Resolve-InstallDir -ResolvedScope $Scope -ResolvedTargetDir $TargetDir
$tmpDir = Join-Path ([System.IO.Path]::GetTempPath()) ("feynman-skills-install-" + [System.Guid]::NewGuid().ToString("N"))
New-Item -ItemType Directory -Path $tmpDir | Out-Null
try {
$archivePath = Join-Path $tmpDir "feynman-skills.zip"
$extractDir = Join-Path $tmpDir "extract"
Write-Host "==> Downloading Feynman skills $resolvedVersion"
Invoke-WebRequest -Uri $downloadUrl -OutFile $archivePath
Write-Host "==> Extracting skills"
Expand-Archive -LiteralPath $archivePath -DestinationPath $extractDir -Force
$sourceRoot = Get-ChildItem -Path $extractDir -Directory | Select-Object -First 1
if (-not $sourceRoot) {
throw "Could not find extracted Feynman archive."
}
$skillsSource = Join-Path $sourceRoot.FullName "skills"
if (-not (Test-Path $skillsSource)) {
throw "Could not find skills/ in downloaded archive."
}
$installParent = Split-Path $installDir -Parent
if ($installParent) {
New-Item -ItemType Directory -Path $installParent -Force | Out-Null
}
if (Test-Path $installDir) {
Remove-Item -Recurse -Force $installDir
}
New-Item -ItemType Directory -Path $installDir -Force | Out-Null
Copy-Item -Path (Join-Path $skillsSource "*") -Destination $installDir -Recurse -Force
Write-Host "==> Installed skills to $installDir"
if ($Scope -eq "Repo") {
Write-Host "Repo-local skills will be discovered automatically from .agents/skills."
} else {
Write-Host "User-level skills will be discovered from `$CODEX_HOME/skills."
}
Write-Host "Feynman skills $resolvedVersion installed successfully."
} finally {
if (Test-Path $tmpDir) {
Remove-Item -Recurse -Force $tmpDir
}
}


@@ -0,0 +1,204 @@
#!/bin/sh
set -eu
VERSION="latest"
SCOPE="${FEYNMAN_SKILLS_SCOPE:-user}"
TARGET_DIR="${FEYNMAN_SKILLS_DIR:-}"
step() {
printf '==> %s\n' "$1"
}
normalize_version() {
case "$1" in
"")
printf 'latest\n'
;;
latest | stable)
printf 'latest\n'
;;
edge)
echo "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." >&2
exit 1
;;
v*)
printf '%s\n' "${1#v}"
;;
*)
printf '%s\n' "$1"
;;
esac
}
download_file() {
url="$1"
output="$2"
if command -v curl >/dev/null 2>&1; then
if [ -t 2 ]; then
curl -fL --progress-bar "$url" -o "$output"
else
curl -fsSL "$url" -o "$output"
fi
return
fi
if command -v wget >/dev/null 2>&1; then
if [ -t 2 ]; then
wget --show-progress -O "$output" "$url"
else
wget -q -O "$output" "$url"
fi
return
fi
echo "curl or wget is required to install Feynman skills." >&2
exit 1
}
download_text() {
url="$1"
if command -v curl >/dev/null 2>&1; then
curl -fsSL "$url"
return
fi
if command -v wget >/dev/null 2>&1; then
wget -q -O - "$url"
return
fi
echo "curl or wget is required to install Feynman skills." >&2
exit 1
}
resolve_version() {
normalized_version="$(normalize_version "$VERSION")"
if [ "$normalized_version" = "latest" ]; then
release_page="$(download_text "https://github.com/getcompanion-ai/feynman/releases/latest")"
resolved_version="$(printf '%s\n' "$release_page" | sed -n 's@.*releases/tag/v\([0-9][^"<>[:space:]]*\).*@\1@p' | head -n 1)"
if [ -z "$resolved_version" ]; then
echo "Failed to resolve the latest Feynman release version." >&2
exit 1
fi
printf '%s\nv%s\n' "$resolved_version" "$resolved_version"
return
fi
printf '%s\nv%s\n' "$normalized_version" "$normalized_version"
}
resolve_target_dir() {
if [ -n "$TARGET_DIR" ]; then
printf '%s\n' "$TARGET_DIR"
return
fi
case "$SCOPE" in
repo)
printf '%s/.agents/skills/feynman\n' "$PWD"
;;
user)
codex_home="${CODEX_HOME:-$HOME/.codex}"
printf '%s/skills/feynman\n' "$codex_home"
;;
*)
echo "Unknown scope: $SCOPE (expected --user or --repo)" >&2
exit 1
;;
esac
}
while [ $# -gt 0 ]; do
case "$1" in
--repo)
SCOPE="repo"
;;
--user)
SCOPE="user"
;;
--dir)
if [ $# -lt 2 ]; then
echo "Usage: install-skills.sh [stable|latest|<version>] [--user|--repo] [--dir <path>]" >&2
exit 1
fi
TARGET_DIR="$2"
shift
;;
edge|stable|latest|v*|[0-9]*)
VERSION="$1"
;;
*)
echo "Unknown argument: $1" >&2
echo "Usage: install-skills.sh [stable|latest|<version>] [--user|--repo] [--dir <path>]" >&2
exit 1
;;
esac
shift
done
archive_metadata="$(resolve_version)"
resolved_version="$(printf '%s\n' "$archive_metadata" | sed -n '1p')"
git_ref="$(printf '%s\n' "$archive_metadata" | sed -n '2p')"
archive_url=""
case "$git_ref" in
main)
archive_url="https://github.com/getcompanion-ai/feynman/archive/refs/heads/main.tar.gz"
;;
v*)
archive_url="https://github.com/getcompanion-ai/feynman/archive/refs/tags/${git_ref}.tar.gz"
;;
esac
if [ -z "$archive_url" ]; then
echo "Could not resolve a download URL for ref: $git_ref" >&2
exit 1
fi
install_dir="$(resolve_target_dir)"
step "Installing Feynman skills ${resolved_version} (${SCOPE})"
tmp_dir="$(mktemp -d)"
cleanup() {
rm -rf "$tmp_dir"
}
trap cleanup EXIT INT TERM
archive_path="$tmp_dir/feynman-skills.tar.gz"
step "Downloading skills archive"
download_file "$archive_url" "$archive_path"
extract_dir="$tmp_dir/extract"
mkdir -p "$extract_dir"
step "Extracting skills"
tar -xzf "$archive_path" -C "$extract_dir"
source_root="$(find "$extract_dir" -mindepth 1 -maxdepth 1 -type d | head -n 1)"
if [ -z "$source_root" ] || [ ! -d "$source_root/skills" ]; then
echo "Could not find skills/ in downloaded archive." >&2
exit 1
fi
mkdir -p "$(dirname "$install_dir")"
rm -rf "$install_dir"
mkdir -p "$install_dir"
cp -R "$source_root/skills/." "$install_dir/"
step "Installed skills to $install_dir"
case "$SCOPE" in
repo)
step "Repo-local skills will be discovered automatically from .agents/skills"
;;
user)
step "User-level skills will be discovered from \$CODEX_HOME/skills"
;;
esac
printf 'Feynman skills %s installed successfully.\n' "$resolved_version"


@@ -1,5 +1,5 @@
param(
[string]$Version = "edge"
[string]$Version = "latest"
)
$ErrorActionPreference = "Stop"
@@ -8,17 +8,27 @@ function Normalize-Version {
param([string]$RequestedVersion)
if (-not $RequestedVersion) {
return "edge"
return "latest"
}
switch ($RequestedVersion.ToLowerInvariant()) {
"edge" { return "edge" }
"latest" { return "latest" }
"stable" { return "latest" }
"edge" { throw "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." }
default { return $RequestedVersion.TrimStart("v") }
}
}
function Resolve-LatestReleaseVersion {
$page = Invoke-WebRequest -Uri "https://github.com/getcompanion-ai/feynman/releases/latest"
$match = [regex]::Match($page.Content, 'releases/tag/v([0-9][^"''<>\s]*)')
if (-not $match.Success) {
throw "Failed to resolve the latest Feynman release version."
}
return $match.Groups[1].Value
}
function Resolve-ReleaseMetadata {
param(
[string]$RequestedVersion,
@@ -28,34 +38,8 @@ function Resolve-ReleaseMetadata {
$normalizedVersion = Normalize-Version -RequestedVersion $RequestedVersion
if ($normalizedVersion -eq "edge") {
$release = Invoke-RestMethod -Uri "https://api.github.com/repos/getcompanion-ai/feynman/releases/tags/edge"
$asset = $release.assets | Where-Object { $_.name -like "feynman-*-$AssetTarget.$BundleExtension" } | Select-Object -First 1
if (-not $asset) {
throw "Failed to resolve the latest Feynman edge bundle."
}
$archiveName = $asset.name
$suffix = ".$BundleExtension"
$bundleName = $archiveName.Substring(0, $archiveName.Length - $suffix.Length)
$resolvedVersion = $bundleName.Substring("feynman-".Length)
$resolvedVersion = $resolvedVersion.Substring(0, $resolvedVersion.Length - ("-$AssetTarget").Length)
return [PSCustomObject]@{
ResolvedVersion = $resolvedVersion
BundleName = $bundleName
ArchiveName = $archiveName
DownloadUrl = $asset.browser_download_url
}
}
if ($normalizedVersion -eq "latest") {
$release = Invoke-RestMethod -Uri "https://api.github.com/repos/getcompanion-ai/feynman/releases/latest"
if (-not $release.tag_name) {
throw "Failed to resolve the latest Feynman release version."
}
$resolvedVersion = $release.tag_name.TrimStart("v")
$resolvedVersion = Resolve-LatestReleaseVersion
} else {
$resolvedVersion = $normalizedVersion
}
@@ -73,12 +57,26 @@ function Resolve-ReleaseMetadata {
}
function Get-ArchSuffix {
$arch = [System.Runtime.InteropServices.RuntimeInformation]::OSArchitecture
switch ($arch.ToString()) {
"X64" { return "x64" }
"Arm64" { return "arm64" }
default { throw "Unsupported architecture: $arch" }
# Prefer PROCESSOR_ARCHITECTURE which is always available on Windows.
# RuntimeInformation::OSArchitecture requires .NET 4.7.1+ and may not
# be loaded in every Windows PowerShell 5.1 session.
$envArch = $env:PROCESSOR_ARCHITECTURE
if ($envArch) {
switch ($envArch) {
"AMD64" { return "x64" }
"ARM64" { return "arm64" }
}
}
try {
$arch = [System.Runtime.InteropServices.RuntimeInformation]::OSArchitecture
switch ($arch.ToString()) {
"X64" { return "x64" }
"Arm64" { return "arm64" }
}
} catch {}
throw "Unsupported architecture: $envArch"
}
$archSuffix = Get-ArchSuffix
@@ -134,7 +132,11 @@ Workarounds:
"@ | Set-Content -Path $shimPath -Encoding ASCII
$currentUserPath = [Environment]::GetEnvironmentVariable("Path", "User")
if (-not $currentUserPath.Split(';').Contains($installBinDir)) {
$alreadyOnPath = $false
if ($currentUserPath) {
$alreadyOnPath = $currentUserPath.Split(';') -contains $installBinDir
}
if (-not $alreadyOnPath) {
$updatedPath = if ([string]::IsNullOrWhiteSpace($currentUserPath)) {
$installBinDir
} else {
@@ -146,6 +148,16 @@ Workarounds:
Write-Host "$installBinDir is already on PATH."
}
$resolvedCommand = Get-Command feynman -ErrorAction SilentlyContinue
if ($resolvedCommand -and $resolvedCommand.Source -ne $shimPath) {
Write-Warning "Current shell resolves feynman to $($resolvedCommand.Source)"
Write-Host "Run in a new shell, or run: `$env:Path = '$installBinDir;' + `$env:Path"
Write-Host "Then run: feynman"
if ($resolvedCommand.Source -like "*node_modules*@companion-ai*feynman*") {
Write-Host "If that path is an old global npm install, remove it with: npm uninstall -g @companion-ai/feynman"
}
}
Write-Host "Feynman $resolvedVersion installed successfully."
} finally {
if (Test-Path $tmpDir) {

View File

@@ -2,7 +2,7 @@
set -eu
VERSION="${1:-edge}"
VERSION="${1:-latest}"
INSTALL_BIN_DIR="${FEYNMAN_INSTALL_BIN_DIR:-$HOME/.local/bin}"
INSTALL_APP_DIR="${FEYNMAN_INSTALL_APP_DIR:-$HOME/.local/share/feynman}"
SKIP_PATH_UPDATE="${FEYNMAN_INSTALL_SKIP_PATH_UPDATE:-0}"
@@ -54,12 +54,16 @@ run_with_spinner() {
normalize_version() {
case "$1" in
"" | edge)
printf 'edge\n'
"")
printf 'latest\n'
;;
latest | stable)
printf 'latest\n'
;;
edge)
echo "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." >&2
exit 1
;;
v*)
printf '%s\n' "${1#v}"
;;
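
The case arms above reduce to a small mapping; a TypeScript sketch of the same rules (name illustrative; inputs not covered by the shown hunk pass through unchanged in this sketch):

```typescript
// Mirror of normalize_version: empty/"latest"/"stable" map to "latest",
// "edge" is rejected outright, and "vX.Y.Z" has the leading "v" stripped.
function normalizeVersion(input: string): string {
  if (input === "" || input === "latest" || input === "stable") return "latest";
  if (input === "edge") {
    throw new Error("The edge channel has been removed; use the latest tagged release or pass an exact version.");
  }
  return input.startsWith("v") ? input.slice(1) : input;
}
```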
@@ -160,39 +164,33 @@ require_command() {
fi
}
resolve_release_metadata() {
normalized_version="$(normalize_version "$VERSION")"
warn_command_conflict() {
expected_path="$INSTALL_BIN_DIR/feynman"
resolved_path="$(command -v feynman 2>/dev/null || true)"
if [ "$normalized_version" = "edge" ]; then
release_json="$(download_text "https://api.github.com/repos/getcompanion-ai/feynman/releases/tags/edge")"
asset_url=""
for candidate in $(printf '%s\n' "$release_json" | sed -n 's/.*"browser_download_url":[[:space:]]*"\([^"]*\)".*/\1/p'); do
case "$candidate" in
*/feynman-*-${asset_target}.${archive_extension})
asset_url="$candidate"
break
;;
esac
done
if [ -z "$asset_url" ]; then
echo "Failed to resolve the latest Feynman edge bundle." >&2
exit 1
fi
archive_name="${asset_url##*/}"
bundle_name="${archive_name%.$archive_extension}"
resolved_version="${bundle_name#feynman-}"
resolved_version="${resolved_version%-${asset_target}}"
printf '%s\n%s\n%s\n%s\n' "$resolved_version" "$bundle_name" "$archive_name" "$asset_url"
if [ -z "$resolved_path" ]; then
return
fi
if [ "$resolved_path" != "$expected_path" ]; then
step "Warning: current shell resolves feynman to $resolved_path"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
step "Or launch directly: $expected_path"
case "$resolved_path" in
*"/node_modules/@companion-ai/feynman/"* | *"/node_modules/.bin/feynman")
step "If that path is an old global npm install, remove it with: npm uninstall -g @companion-ai/feynman"
;;
esac
fi
}
resolve_release_metadata() {
normalized_version="$(normalize_version "$VERSION")"
if [ "$normalized_version" = "latest" ]; then
release_json="$(download_text "https://api.github.com/repos/getcompanion-ai/feynman/releases/latest")"
resolved_version="$(printf '%s\n' "$release_json" | sed -n 's/.*"tag_name":[[:space:]]*"v\([^"]*\)".*/\1/p' | head -n 1)"
release_page="$(download_text "https://github.com/getcompanion-ai/feynman/releases/latest")"
resolved_version="$(printf '%s\n' "$release_page" | sed -n 's@.*releases/tag/v\([0-9][^"<>[:space:]]*\).*@\1@p' | head -n 1)"
if [ -z "$resolved_version" ]; then
echo "Failed to resolve the latest Feynman release version." >&2
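
The sed expression above scrapes the version out of the public releases page HTML rather than calling the rate-limited API. An equivalent sketch of that extraction (function name assumed, regex mirroring the sed pattern):

```typescript
// Capture the digits-and-beyond after "releases/tag/v", stopping at a
// quote, angle bracket, or whitespace — same shape as the sed pattern.
function parseLatestVersion(releasePage: string): string | undefined {
  const match = releasePage.match(/releases\/tag\/v([0-9][^"<>\s]*)/);
  return match ? match[1] : undefined;
}
```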
@@ -290,20 +288,22 @@ add_to_path
case "$path_action" in
added)
step "PATH updated for future shells in $path_profile"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && feynman"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
;;
configured)
step "PATH is already configured for future shells in $path_profile"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && feynman"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
;;
skipped)
step "PATH update skipped"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && feynman"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
;;
*)
step "$INSTALL_BIN_DIR is already on PATH"
step "Run: feynman"
step "Run: hash -r && feynman"
;;
esac
warn_command_conflict
printf 'Feynman %s installed successfully.\n' "$resolved_version"

View File

@@ -51,6 +51,7 @@ const cliPath = piPackageRoot ? resolve(piPackageRoot, "dist", "cli.js") : null;
const bunCliPath = piPackageRoot ? resolve(piPackageRoot, "dist", "bun", "cli.js") : null;
const interactiveModePath = piPackageRoot ? resolve(piPackageRoot, "dist", "modes", "interactive", "interactive-mode.js") : null;
const interactiveThemePath = piPackageRoot ? resolve(piPackageRoot, "dist", "modes", "interactive", "theme", "theme.js") : null;
const terminalPath = piTuiRoot ? resolve(piTuiRoot, "dist", "terminal.js") : null;
const editorPath = piTuiRoot ? resolve(piTuiRoot, "dist", "components", "editor.js") : null;
const workspaceRoot = resolve(appRoot, ".feynman", "npm", "node_modules");
const webAccessPath = resolve(workspaceRoot, "pi-web-access", "index.ts");
@@ -138,12 +139,18 @@ function restorePackagedWorkspace(packageSpecs) {
timeout: 300000,
});
// On Windows, tar may exit non-zero due to symlink creation failures in
// .bin/ directories. These are non-fatal — check whether the actual
// package directories were extracted successfully.
const packagesPresent = packageSpecs.every((spec) => existsSync(resolve(workspaceRoot, parsePackageName(spec))));
if (packagesPresent) return true;
if (result.status !== 0) {
if (result.stderr?.length) process.stderr.write(result.stderr);
return false;
}
return packageSpecs.every((spec) => existsSync(resolve(workspaceRoot, parsePackageName(spec))));
return false;
}
function refreshPackagedWorkspace(packageSpecs) {
@@ -155,12 +162,18 @@ function resolveExecutable(name, fallbackPaths = []) {
if (existsSync(candidate)) return candidate;
}
const result = spawnSync("sh", ["-lc", `command -v ${name}`], {
encoding: "utf8",
stdio: ["ignore", "pipe", "ignore"],
});
const isWindows = process.platform === "win32";
const result = isWindows
? spawnSync("cmd", ["/c", `where ${name}`], {
encoding: "utf8",
stdio: ["ignore", "pipe", "ignore"],
})
: spawnSync("sh", ["-lc", `command -v ${name}`], {
encoding: "utf8",
stdio: ["ignore", "pipe", "ignore"],
});
if (result.status === 0) {
const resolved = result.stdout.trim();
const resolved = result.stdout.trim().split(/\r?\n/)[0];
if (resolved) return resolved;
}
return null;
@@ -247,10 +260,68 @@ for (const entryPath of [cliPath, bunCliPath].filter(Boolean)) {
continue;
}
const cliSource = readFileSync(entryPath, "utf8");
let cliSource = readFileSync(entryPath, "utf8");
if (cliSource.includes('process.title = "pi";')) {
writeFileSync(entryPath, cliSource.replace('process.title = "pi";', 'process.title = "feynman";'), "utf8");
cliSource = cliSource.replace('process.title = "pi";', 'process.title = "feynman";');
}
const stdinErrorGuard = [
"const feynmanHandleStdinError = (error) => {",
' if (error && typeof error === "object") {',
' const code = "code" in error ? error.code : undefined;',
' const syscall = "syscall" in error ? error.syscall : undefined;',
' if ((code === "EIO" || code === "EBADF") && syscall === "read") {',
" return;",
" }",
" }",
"};",
'process.stdin?.on?.("error", feynmanHandleStdinError);',
].join("\n");
if (!cliSource.includes('process.stdin?.on?.("error", feynmanHandleStdinError);')) {
cliSource = cliSource.replace(
'process.emitWarning = (() => { });',
`process.emitWarning = (() => { });\n${stdinErrorGuard}`,
);
}
writeFileSync(entryPath, cliSource, "utf8");
}
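
The injected guard only swallows stdin read failures carrying EIO or EBADF; everything else still propagates. The predicate can be isolated as (helper name hypothetical):

```typescript
// True only for the stdin errors the patch intends to ignore:
// EIO/EBADF on syscall "read" (seen when the controlling tty goes away).
function shouldSwallowStdinError(error: unknown): boolean {
  if (error && typeof error === "object") {
    const code = "code" in error ? (error as { code?: unknown }).code : undefined;
    const syscall = "syscall" in error ? (error as { syscall?: unknown }).syscall : undefined;
    return (code === "EIO" || code === "EBADF") && syscall === "read";
  }
  return false;
}
```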
if (terminalPath && existsSync(terminalPath)) {
let terminalSource = readFileSync(terminalPath, "utf8");
if (!terminalSource.includes("stdinErrorHandler;")) {
terminalSource = terminalSource.replace(
" stdinBuffer;\n stdinDataHandler;\n",
[
" stdinBuffer;",
" stdinDataHandler;",
" stdinErrorHandler = (error) => {",
' if ((error?.code === "EIO" || error?.code === "EBADF") && error?.syscall === "read") {',
" return;",
" }",
" };",
].join("\n") + "\n",
);
}
if (!terminalSource.includes('process.stdin.on("error", this.stdinErrorHandler);')) {
terminalSource = terminalSource.replace(
' process.stdin.resume();\n',
' process.stdin.resume();\n process.stdin.on("error", this.stdinErrorHandler);\n',
);
}
if (!terminalSource.includes(' process.stdin.removeListener("error", this.stdinErrorHandler);')) {
terminalSource = terminalSource.replace(
' process.stdin.removeListener("data", onData);\n this.inputHandler = previousHandler;\n',
[
' process.stdin.removeListener("data", onData);',
' process.stdin.removeListener("error", this.stdinErrorHandler);',
' this.inputHandler = previousHandler;',
].join("\n"),
);
terminalSource = terminalSource.replace(
' process.stdin.pause();\n',
' process.stdin.removeListener("error", this.stdinErrorHandler);\n process.stdin.pause();\n',
);
}
writeFileSync(terminalPath, terminalSource, "utf8");
}
if (interactiveModePath && existsSync(interactiveModePath)) {
@@ -482,6 +553,11 @@ if (alphaHubAuthPath && existsSync(alphaHubAuthPath)) {
if (source.includes(oldError)) {
source = source.replace(oldError, newError);
}
const brokenWinOpen = "else if (plat === 'win32') execSync(`start \"${url}\"`);";
const fixedWinOpen = "else if (plat === 'win32') execSync(`cmd /c start \"\" \"${url}\"`);";
if (source.includes(brokenWinOpen)) {
source = source.replace(brokenWinOpen, fixedWinOpen);
}
writeFileSync(alphaHubAuthPath, source, "utf8");
}

View File

@@ -7,5 +7,7 @@ const websitePublicDir = resolve(appRoot, "website", "public");
mkdirSync(websitePublicDir, { recursive: true });
cpSync(resolve(appRoot, "scripts", "install", "install.sh"), resolve(websitePublicDir, "install"));
cpSync(resolve(appRoot, "scripts", "install", "install.ps1"), resolve(websitePublicDir, "install.ps1"));
cpSync(resolve(appRoot, "scripts", "install", "install-skills.sh"), resolve(websitePublicDir, "install-skills"));
cpSync(resolve(appRoot, "scripts", "install", "install-skills.ps1"), resolve(websitePublicDir, "install-skills.ps1"));
console.log("[feynman] synced website installers");

View File

@@ -11,7 +11,7 @@ Use the `alpha` CLI via bash for all paper research operations.
| Command | Description |
|---------|-------------|
| `alpha search "<query>"` | Search papers. Modes: `--mode semantic`, `--mode keyword`, `--mode agentic` |
| `alpha search "<query>"` | Search papers. Prefer `--mode semantic` by default; use `--mode keyword` only for exact-term lookup and `--mode agentic` for broader retrieval. |
| `alpha get <arxiv-id-or-url>` | Fetch paper content and any local annotation |
| `alpha get --full-text <arxiv-id>` | Get raw full text instead of AI report |
| `alpha ask <arxiv-id> "<question>"` | Ask a question about a paper's PDF |
@@ -22,7 +22,7 @@ Use the `alpha` CLI via bash for all paper research operations.
## Auth
Run `alpha login` to authenticate with alphaXiv. Check status with `alpha status`.
Run `alpha login` to authenticate with alphaXiv. Check status with `feynman alpha status`, or `alpha status` once your installed `alpha-hub` version includes it.
## Examples

View File

@@ -0,0 +1,28 @@
---
name: contributing
description: Contribute changes to the Feynman repository itself. Use when the task is to add features, fix bugs, update prompts or skills, change install or release behavior, improve docs, or prepare a focused PR against this repo.
---
# Contributing
Read `CONTRIBUTING.md` first, then `AGENTS.md` for repo-level agent conventions.
Use this skill when working on Feynman itself, especially for:
- CLI or runtime changes in `src/`
- prompt changes in `prompts/`
- bundled skill changes in `skills/`
- subagent behavior changes in `.feynman/agents/`
- install, packaging, or release changes in `scripts/`, `README.md`, or website docs
Minimum local checks before claiming the repo change is done:
```bash
npm test
npm run typecheck
npm run build
```
If the docs site changed, also validate `website/`.
When changing release-sensitive behavior, verify that `.nvmrc`, package `engines`, runtime guards, and install docs stay aligned.

View File

@@ -0,0 +1,19 @@
---
name: valichord-validation
description: Integrate with ValiChord to submit a replication as a cryptographically verified validator attestation, discover studies awaiting independent validation, query Harmony Records and reproducibility badges, or assist researchers in preparing a study for the validation pipeline. Feynman operates as a first-class AI validator — publishing a validator profile, claiming studies, running the blind commit-reveal protocol, and accumulating a verifiable per-discipline reputation. Also surfaces reproducibility status during /deepresearch and literature reviews via ValiChord's HTTP Gateway.
---
# ValiChord Validation
Run the `/valichord` workflow. Read the prompt template at `prompts/valichord.md` for the full procedure.
ValiChord is a four-DNA Holochain system for scientific reproducibility verification. Feynman integrates at four points:
- As a **validator agent** — running `/replicate` then submitting findings as a sealed attestation into the blind commit-reveal protocol, earning reproducibility badges for researchers and building Feynman's own verifiable per-discipline reputation (Provisional → Certified → Senior)
- As a **proactive discovery agent** — querying the pending study queue by discipline, assessing difficulty, and autonomously claiming appropriate validation work without waiting to be assigned
- As a **researcher's assistant** — helping prepare studies for submission: registering protocols, taking cryptographic data snapshots, and running the Repository Readiness Checker to identify and fix reproducibility failure modes before validation begins
- As a **research query tool** — checking whether a study carries a Harmony Record or reproducibility badge (Gold/Silver/Bronze) via ValiChord's HTTP Gateway, for use during `/deepresearch` or literature reviews
Output: a Harmony Record — an immutable, publicly accessible cryptographic proof of independent reproducibility written to the ValiChord Governance DHT — plus automatic badge issuance and an updated validator reputation score.
Live demo (commit-reveal cycle end-to-end): https://youtu.be/DQ5wZSD1YEw
ValiChord repo: https://github.com/topeuph-ai/ValiChord

View File

@@ -11,7 +11,7 @@ import {
login as loginAlpha,
logout as logoutAlpha,
} from "@companion-ai/alpha-hub/lib";
import { AuthStorage, DefaultPackageManager, ModelRegistry, SettingsManager } from "@mariozechner/pi-coding-agent";
import { DefaultPackageManager, SettingsManager } from "@mariozechner/pi-coding-agent";
import { syncBundledAssets } from "./bootstrap/sync.js";
import { ensureFeynmanHome, getDefaultSessionDir, getFeynmanAgentDir, getFeynmanHome } from "./config/paths.js";
@@ -19,6 +19,7 @@ import { launchPiChat } from "./pi/launch.js";
import { CORE_PACKAGE_SOURCES, getOptionalPackagePresetSources, listOptionalPackagePresets } from "./pi/package-presets.js";
import { normalizeFeynmanSettings, normalizeThinkingLevel, parseModelSpec } from "./pi/settings.js";
import {
authenticateModelProvider,
getCurrentModelSpec,
loginModelProvider,
logoutModelProvider,
@@ -30,6 +31,7 @@ import { runDoctor, runStatus } from "./setup/doctor.js";
import { setupPreviewDependencies } from "./setup/preview.js";
import { runSetup } from "./setup/setup.js";
import { ASH, printAsciiHeader, printInfo, printPanel, printSection, RESET, SAGE } from "./ui/terminal.js";
import { createModelRegistry } from "./model/registry.js";
import {
cliCommandSections,
formatCliWorkflowUsage,
@@ -124,7 +126,13 @@ async function handleModelCommand(subcommand: string | undefined, args: string[]
}
if (subcommand === "login") {
await loginModelProvider(feynmanAuthPath, args[0], feynmanSettingsPath);
if (args[0]) {
// Specific provider given - use OAuth login directly
await loginModelProvider(feynmanAuthPath, args[0], feynmanSettingsPath);
} else {
// No provider specified - show auth method choice
await authenticateModelProvider(feynmanAuthPath, feynmanSettingsPath);
}
return;
}
@@ -427,7 +435,7 @@ export async function main(): Promise<void> {
const explicitModelSpec = values.model ?? process.env.FEYNMAN_MODEL;
if (explicitModelSpec) {
const modelRegistry = new ModelRegistry(AuthStorage.create(feynmanAuthPath));
const modelRegistry = createModelRegistry(feynmanAuthPath);
const explicitModel = parseModelSpec(explicitModelSpec, modelRegistry);
if (!explicitModel) {
throw new Error(`Unknown model: ${explicitModelSpec}`);

View File

@@ -1,6 +1,12 @@
import { main } from "./cli.js";
import { ensureSupportedNodeVersion } from "./system/node-version.js";
main().catch((error) => {
async function run(): Promise<void> {
ensureSupportedNodeVersion();
const { main } = await import("./cli.js");
await main();
}
run().catch((error) => {
console.error(error instanceof Error ? error.message : String(error));
process.exitCode = 1;
});

View File

@@ -1,4 +1,4 @@
import { AuthStorage, ModelRegistry } from "@mariozechner/pi-coding-agent";
import { createModelRegistry } from "./registry.js";
type ModelRecord = {
provider: string;
@@ -166,10 +166,6 @@ function sortProviders(left: ProviderStatus, right: ProviderStatus): number {
return left.label.localeCompare(right.label);
}
function createModelRegistry(authPath: string): ModelRegistry {
return new ModelRegistry(AuthStorage.create(authPath));
}
export function getAvailableModelRecords(authPath: string): ModelRecord[] {
return createModelRegistry(authPath)
.getAvailable()
@@ -258,7 +254,9 @@ export function buildModelStatusSnapshotFromRecords(
const guidance: string[] = [];
if (available.length === 0) {
guidance.push("No authenticated Pi models are available yet.");
guidance.push("Run `feynman model login <provider>` or add provider credentials that Pi can see.");
guidance.push(
"Run `feynman model login <provider>` (OAuth) or configure an API key (env var, auth.json, or models.json for custom providers).",
);
guidance.push("After auth is in place, rerun `feynman model list` or `feynman setup model`.");
} else if (!current) {
guidance.push(`No default research model is set. Recommended: ${recommended?.spec}.`);

View File

@@ -1,5 +1,7 @@
import { AuthStorage } from "@mariozechner/pi-coding-agent";
import { writeFileSync } from "node:fs";
import { exec as execCallback } from "node:child_process";
import { promisify } from "node:util";
import { readJson } from "../pi/settings.js";
import { promptChoice, promptText } from "../setup/prompts.js";
@@ -12,6 +14,10 @@ import {
getSupportedModelRecords,
type ModelStatusSnapshot,
} from "./catalog.js";
import { createModelRegistry, getModelsJsonPath } from "./registry.js";
import { upsertProviderBaseUrl, upsertProviderConfig } from "./models-json.js";
const exec = promisify(execCallback);
function collectModelStatus(settingsPath: string, authPath: string): ModelStatusSnapshot {
return buildModelStatusSnapshotFromRecords(
@@ -58,6 +64,453 @@ async function selectOAuthProvider(authPath: string, action: "login" | "logout")
return providers[selection];
}
type ApiKeyProviderInfo = {
id: string;
label: string;
envVar?: string;
};
const API_KEY_PROVIDERS: ApiKeyProviderInfo[] = [
{ id: "__custom__", label: "Custom provider (baseUrl + API key)" },
{ id: "openai", label: "OpenAI Platform API", envVar: "OPENAI_API_KEY" },
{ id: "anthropic", label: "Anthropic API", envVar: "ANTHROPIC_API_KEY" },
{ id: "google", label: "Google Gemini API", envVar: "GEMINI_API_KEY" },
{ id: "openrouter", label: "OpenRouter", envVar: "OPENROUTER_API_KEY" },
{ id: "zai", label: "Z.AI / GLM", envVar: "ZAI_API_KEY" },
{ id: "kimi-coding", label: "Kimi / Moonshot", envVar: "KIMI_API_KEY" },
{ id: "minimax", label: "MiniMax", envVar: "MINIMAX_API_KEY" },
{ id: "minimax-cn", label: "MiniMax (China)", envVar: "MINIMAX_CN_API_KEY" },
{ id: "mistral", label: "Mistral", envVar: "MISTRAL_API_KEY" },
{ id: "groq", label: "Groq", envVar: "GROQ_API_KEY" },
{ id: "xai", label: "xAI", envVar: "XAI_API_KEY" },
{ id: "cerebras", label: "Cerebras", envVar: "CEREBRAS_API_KEY" },
{ id: "vercel-ai-gateway", label: "Vercel AI Gateway", envVar: "AI_GATEWAY_API_KEY" },
{ id: "huggingface", label: "Hugging Face", envVar: "HF_TOKEN" },
{ id: "opencode", label: "OpenCode Zen", envVar: "OPENCODE_API_KEY" },
{ id: "opencode-go", label: "OpenCode Go", envVar: "OPENCODE_API_KEY" },
{ id: "azure-openai-responses", label: "Azure OpenAI (Responses)", envVar: "AZURE_OPENAI_API_KEY" },
];
async function selectApiKeyProvider(): Promise<ApiKeyProviderInfo | undefined> {
const choices = API_KEY_PROVIDERS.map(
(provider) => `${provider.id}: ${provider.label}${provider.envVar ? ` (${provider.envVar})` : ""}`,
);
choices.push("Cancel");
const selection = await promptChoice("Choose an API-key provider:", choices, 0);
if (selection >= API_KEY_PROVIDERS.length) {
return undefined;
}
return API_KEY_PROVIDERS[selection];
}
type CustomProviderSetup = {
providerId: string;
modelIds: string[];
baseUrl: string;
api: "openai-completions" | "openai-responses" | "anthropic-messages" | "google-generative-ai";
apiKeyConfig: string;
/**
* If true, add `Authorization: Bearer <apiKey>` to requests in addition to
* whatever the API mode uses (useful for proxies that implement /v1/messages
* but expect Bearer auth instead of x-api-key).
*/
authHeader: boolean;
};
function normalizeProviderId(value: string): string {
return value.trim().toLowerCase().replace(/\s+/g, "-");
}
function normalizeModelIds(value: string): string[] {
const items = value
.split(",")
.map((entry) => entry.trim())
.filter(Boolean);
return Array.from(new Set(items));
}
function normalizeBaseUrl(value: string): string {
return value.trim().replace(/\/+$/, "");
}
function normalizeCustomProviderBaseUrl(
api: CustomProviderSetup["api"],
baseUrl: string,
): { baseUrl: string; note?: string } {
const normalized = normalizeBaseUrl(baseUrl);
if (!normalized) {
return { baseUrl: normalized };
}
// Pi expects Anthropic baseUrl without `/v1` (it appends `/v1/messages` internally).
if (api === "anthropic-messages" && /\/v1$/i.test(normalized)) {
return { baseUrl: normalized.replace(/\/v1$/i, ""), note: "Stripped trailing /v1 for Anthropic mode." };
}
return { baseUrl: normalized };
}
function isLocalBaseUrl(baseUrl: string): boolean {
return /^(https?:\/\/)?(localhost|127\.0\.0\.1|0\.0\.0\.0)(:|\/|$)/i.test(baseUrl);
}
async function resolveApiKeyConfig(apiKeyConfig: string): Promise<string | undefined> {
const trimmed = apiKeyConfig.trim();
if (!trimmed) return undefined;
if (trimmed.startsWith("!")) {
const command = trimmed.slice(1).trim();
if (!command) return undefined;
const shell = process.platform === "win32" ? process.env.ComSpec || "cmd.exe" : process.env.SHELL || "/bin/sh";
try {
const { stdout } = await exec(command, { shell, maxBuffer: 1024 * 1024 });
const value = stdout.trim();
return value || undefined;
} catch {
return undefined;
}
}
const envValue = process.env[trimmed];
if (typeof envValue === "string" && envValue.trim()) {
return envValue.trim();
}
// Fall back to literal value.
return trimmed;
}
async function bestEffortFetchOpenAiModelIds(
baseUrl: string,
apiKey: string,
authHeader: boolean,
): Promise<string[] | undefined> {
const url = `${baseUrl}/models`;
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), 5000);
try {
const response = await fetch(url, {
method: "GET",
headers: authHeader ? { Authorization: `Bearer ${apiKey}` } : undefined,
signal: controller.signal,
});
if (!response.ok) {
return undefined;
}
const json = (await response.json()) as any;
if (!Array.isArray(json?.data)) return undefined;
return json.data
.map((entry: any) => (typeof entry?.id === "string" ? entry.id : undefined))
.filter(Boolean);
} catch {
return undefined;
} finally {
clearTimeout(timer);
}
}
async function promptCustomProviderSetup(): Promise<CustomProviderSetup | undefined> {
printSection("Custom Provider");
const providerIdInput = await promptText("Provider id (e.g. my-proxy)", "custom");
const providerId = normalizeProviderId(providerIdInput);
if (!providerId || providerId === "__custom__") {
printWarning("Invalid provider id.");
return undefined;
}
const apiChoices = [
"openai-completions — OpenAI Chat Completions compatible (e.g. /v1/chat/completions)",
"openai-responses — OpenAI Responses compatible (e.g. /v1/responses)",
"anthropic-messages — Anthropic Messages compatible (e.g. /v1/messages)",
"google-generative-ai — Google Generative AI compatible (generativelanguage.googleapis.com)",
"Cancel",
];
const apiSelection = await promptChoice("API mode:", apiChoices, 0);
if (apiSelection >= 4) {
return undefined;
}
const api = ["openai-completions", "openai-responses", "anthropic-messages", "google-generative-ai"][apiSelection] as CustomProviderSetup["api"];
const baseUrlDefault = ((): string => {
if (api === "openai-completions" || api === "openai-responses") return "http://localhost:11434/v1";
if (api === "anthropic-messages") return "https://api.anthropic.com";
if (api === "google-generative-ai") return "https://generativelanguage.googleapis.com";
return "http://localhost:11434/v1";
})();
const baseUrlPrompt =
api === "openai-completions" || api === "openai-responses"
? "Base URL (include /v1 for OpenAI-compatible endpoints)"
: api === "anthropic-messages"
? "Base URL (no trailing /, no /v1)"
: "Base URL (no trailing /)";
const baseUrlRaw = await promptText(baseUrlPrompt, baseUrlDefault);
const { baseUrl, note: baseUrlNote } = normalizeCustomProviderBaseUrl(api, baseUrlRaw);
if (!baseUrl) {
printWarning("Base URL is required.");
return undefined;
}
if (baseUrlNote) {
printInfo(baseUrlNote);
}
let authHeader = false;
if (api === "openai-completions" || api === "openai-responses") {
const defaultAuthHeader = !isLocalBaseUrl(baseUrl);
const authHeaderChoices = [
"Yes (send Authorization: Bearer <apiKey>)",
"No (common for local Ollama/vLLM/LM Studio)",
"Cancel",
];
const authHeaderSelection = await promptChoice(
"Send Authorization header?",
authHeaderChoices,
defaultAuthHeader ? 0 : 1,
);
if (authHeaderSelection >= 2) {
return undefined;
}
authHeader = authHeaderSelection === 0;
}
if (api === "anthropic-messages") {
const defaultAuthHeader = isLocalBaseUrl(baseUrl);
const authHeaderChoices = [
"Yes (also send Authorization: Bearer <apiKey>)",
"No (standard Anthropic uses x-api-key only)",
"Cancel",
];
const authHeaderSelection = await promptChoice(
"Also send Authorization header?",
authHeaderChoices,
defaultAuthHeader ? 0 : 1,
);
if (authHeaderSelection >= 2) {
return undefined;
}
authHeader = authHeaderSelection === 0;
}
printInfo("API key value supports:");
printInfo(" - literal secret (stored in models.json)");
printInfo(" - env var name (resolved at runtime)");
printInfo(" - !command (executes and uses stdout)");
const apiKeyConfigRaw = (await promptText("API key / resolver", "")).trim();
const apiKeyConfig = apiKeyConfigRaw || "local";
if (!apiKeyConfigRaw) {
printInfo("Using placeholder apiKey value (required by Pi for custom providers).");
}
let modelIdsDefault = "my-model";
if (api === "openai-completions" || api === "openai-responses") {
// Best-effort: hit /models so users can pick correct ids (especially for proxies).
const resolvedKey = await resolveApiKeyConfig(apiKeyConfig);
const modelIds = resolvedKey ? await bestEffortFetchOpenAiModelIds(baseUrl, resolvedKey, authHeader) : undefined;
if (modelIds && modelIds.length > 0) {
const sample = modelIds.slice(0, 10).join(", ");
printInfo(`Detected models: ${sample}${modelIds.length > 10 ? ", ..." : ""}`);
modelIdsDefault = modelIds.includes("sonnet") ? "sonnet" : modelIds[0]!;
}
}
const modelIdsRaw = await promptText("Model id(s) (comma-separated)", modelIdsDefault);
const modelIds = normalizeModelIds(modelIdsRaw);
if (modelIds.length === 0) {
printWarning("At least one model id is required.");
return undefined;
}
return { providerId, modelIds, baseUrl, api, apiKeyConfig, authHeader };
}
async function verifyCustomProvider(setup: CustomProviderSetup, authPath: string): Promise<void> {
const registry = createModelRegistry(authPath);
const modelsError = registry.getError();
if (modelsError) {
printWarning("Verification: models.json failed to load.");
for (const line of modelsError.split("\n")) {
printInfo(` ${line}`);
}
return;
}
const all = registry.getAll();
const hasModel = setup.modelIds.some((id) => all.some((model) => model.provider === setup.providerId && model.id === id));
if (!hasModel) {
printWarning("Verification: model registry does not contain the configured provider/model ids.");
return;
}
const available = registry.getAvailable();
const hasAvailable = setup.modelIds.some((id) =>
available.some((model) => model.provider === setup.providerId && model.id === id),
);
if (!hasAvailable) {
printWarning("Verification: provider is not considered authenticated/available.");
return;
}
const apiKey = await registry.getApiKeyForProvider(setup.providerId);
if (!apiKey) {
printWarning("Verification: API key could not be resolved (check env var name / !command).");
return;
}
const timeoutMs = 8000;
// Best-effort network check for OpenAI-compatible endpoints
if (setup.api === "openai-completions" || setup.api === "openai-responses") {
const url = `${setup.baseUrl}/models`;
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), timeoutMs);
try {
const response = await fetch(url, {
method: "GET",
headers: setup.authHeader ? { Authorization: `Bearer ${apiKey}` } : undefined,
signal: controller.signal,
});
if (!response.ok) {
printWarning(`Verification: ${url} returned ${response.status} ${response.statusText}`);
return;
}
const json = (await response.json()) as unknown;
const modelIds = Array.isArray((json as any)?.data)
? (json as any).data.map((entry: any) => (typeof entry?.id === "string" ? entry.id : undefined)).filter(Boolean)
: [];
const missing = setup.modelIds.filter((id) => modelIds.length > 0 && !modelIds.includes(id));
if (modelIds.length > 0 && missing.length > 0) {
printWarning(`Verification: /models does not list configured model id(s): ${missing.join(", ")}`);
return;
}
printSuccess("Verification: endpoint reachable and authorized.");
} catch (error) {
printWarning(`Verification: failed to reach ${url}: ${error instanceof Error ? error.message : String(error)}`);
} finally {
clearTimeout(timer);
}
return;
}
if (setup.api === "anthropic-messages") {
const url = `${setup.baseUrl}/v1/models?limit=1`;
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), timeoutMs);
try {
const headers: Record<string, string> = {
"x-api-key": apiKey,
"anthropic-version": "2023-06-01",
};
if (setup.authHeader) {
headers.Authorization = `Bearer ${apiKey}`;
}
const response = await fetch(url, {
method: "GET",
headers,
signal: controller.signal,
});
if (!response.ok) {
printWarning(`Verification: ${url} returned ${response.status} ${response.statusText}`);
if (response.status === 404) {
printInfo(" Tip: For Anthropic mode, use a base URL without /v1 (e.g. https://api.anthropic.com).");
}
if ((response.status === 401 || response.status === 403) && !setup.authHeader) {
printInfo(" Tip: Some proxies require `Authorization: Bearer <apiKey>` even in Anthropic mode.");
}
return;
}
printSuccess("Verification: endpoint reachable and authorized.");
} catch (error) {
printWarning(`Verification: failed to reach ${url}: ${error instanceof Error ? error.message : String(error)}`);
} finally {
clearTimeout(timer);
}
return;
}
if (setup.api === "google-generative-ai") {
const url = `${setup.baseUrl}/v1beta/models?key=${encodeURIComponent(apiKey)}`;
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), timeoutMs);
try {
const response = await fetch(url, { method: "GET", signal: controller.signal });
if (!response.ok) {
printWarning(`Verification: ${url} returned ${response.status} ${response.statusText}`);
return;
}
printSuccess("Verification: endpoint reachable and authorized.");
} catch (error) {
printWarning(`Verification: failed to reach ${url}: ${error instanceof Error ? error.message : String(error)}`);
} finally {
clearTimeout(timer);
}
return;
}
printInfo("Verification: skipped network probe for this API mode.");
}
async function configureApiKeyProvider(authPath: string): Promise<boolean> {
const provider = await selectApiKeyProvider();
if (!provider) {
printInfo("API key setup cancelled.");
return false;
}
if (provider.id === "__custom__") {
const setup = await promptCustomProviderSetup();
if (!setup) {
printInfo("Custom provider setup cancelled.");
return false;
}
const modelsJsonPath = getModelsJsonPath(authPath);
const result = upsertProviderConfig(modelsJsonPath, setup.providerId, {
baseUrl: setup.baseUrl,
apiKey: setup.apiKeyConfig,
api: setup.api,
authHeader: setup.authHeader,
models: setup.modelIds.map((id) => ({ id })),
});
if (!result.ok) {
printWarning(result.error);
return false;
}
printSuccess(`Saved custom provider: ${setup.providerId}`);
await verifyCustomProvider(setup, authPath);
return true;
}
printSection(`API Key: ${provider.label}`);
if (provider.envVar) {
printInfo(`Tip: to avoid writing secrets to disk, set ${provider.envVar} in your shell or .env.`);
}
const apiKey = await promptText("Paste API key (leave empty to use env var instead)", "");
if (!apiKey) {
if (provider.envVar) {
printInfo(`Set ${provider.envVar} and rerun setup (or run \`feynman model list\`).`);
} else {
printInfo("No API key provided.");
}
return false;
}
AuthStorage.create(authPath).set(provider.id, { type: "api_key", key: apiKey });
printSuccess(`Saved API key for ${provider.id} in auth storage.`);
const baseUrl = await promptText("Base URL override (optional, include /v1 for OpenAI-compatible endpoints)", "");
if (baseUrl) {
const modelsJsonPath = getModelsJsonPath(authPath);
const result = upsertProviderBaseUrl(modelsJsonPath, provider.id, baseUrl);
if (result.ok) {
printSuccess(`Saved baseUrl override for ${provider.id} in models.json.`);
} else {
printWarning(result.error);
}
}
return true;
}
function resolveAvailableModelSpec(authPath: string, input: string): string | undefined {
const normalizedInput = input.trim().toLowerCase();
if (!normalizedInput) {
@@ -111,14 +564,46 @@ export function printModelList(settingsPath: string, authPath: string): void {
}
}
export async function loginModelProvider(authPath: string, providerId?: string, settingsPath?: string): Promise<void> {
export async function authenticateModelProvider(authPath: string, settingsPath?: string): Promise<boolean> {
const choices = [
"API key (OpenAI, Anthropic, Google, custom provider, ...)",
"OAuth login (ChatGPT Plus/Pro, Claude Pro/Max, Copilot, ...)",
"Cancel",
];
const selection = await promptChoice("How do you want to authenticate?", choices, 0);
if (selection === 0) {
const configured = await configureApiKeyProvider(authPath);
if (configured && settingsPath) {
const currentSpec = getCurrentModelSpec(settingsPath);
const available = getAvailableModelRecords(authPath);
const currentValid = currentSpec ? available.some((m) => `${m.provider}/${m.id}` === currentSpec) : false;
if ((!currentSpec || !currentValid) && available.length > 0) {
const recommended = chooseRecommendedModel(authPath);
if (recommended) {
setDefaultModelSpec(settingsPath, authPath, recommended.spec);
}
}
}
return configured;
}
if (selection === 1) {
return loginModelProvider(authPath, undefined, settingsPath);
}
printInfo("Authentication cancelled.");
return false;
}
export async function loginModelProvider(authPath: string, providerId?: string, settingsPath?: string): Promise<boolean> {
const provider = providerId ? resolveOAuthProvider(authPath, providerId) : await selectOAuthProvider(authPath, "login");
if (!provider) {
if (providerId) {
throw new Error(`Unknown OAuth model provider: ${providerId}`);
}
printInfo("Login cancelled.");
return;
return false;
}
const authStorage = AuthStorage.create(authPath);
@@ -166,6 +651,8 @@ export async function loginModelProvider(authPath: string, providerId?: string,
}
}
}
return true;
}
export async function logoutModelProvider(authPath: string, providerId?: string): Promise<void> {
@@ -200,11 +687,34 @@ export function setDefaultModelSpec(settingsPath: string, authPath: string, spec
export async function runModelSetup(settingsPath: string, authPath: string): Promise<void> {
let status = collectModelStatus(settingsPath, authPath);
if (status.availableModels.length === 0) {
await loginModelProvider(authPath, undefined, settingsPath);
while (status.availableModels.length === 0) {
const choices = [
"API key (OpenAI, Anthropic, ZAI, Kimi, MiniMax, ...)",
"OAuth login (ChatGPT Plus/Pro, Claude Pro/Max, Copilot, ...)",
"Cancel",
];
const selection = await promptChoice("Choose how to configure model access:", choices, 0);
if (selection === 0) {
const configured = await configureApiKeyProvider(authPath);
if (!configured) {
status = collectModelStatus(settingsPath, authPath);
continue;
}
} else if (selection === 1) {
const loggedIn = await loginModelProvider(authPath, undefined, settingsPath);
if (!loggedIn) {
status = collectModelStatus(settingsPath, authPath);
continue;
}
} else {
printInfo("Setup cancelled.");
return;
}
status = collectModelStatus(settingsPath, authPath);
if (status.availableModels.length === 0) {
return;
printWarning("No authenticated models are available yet.");
printInfo("If you configured a custom provider, ensure it has `apiKey` set in models.json.");
printInfo("Tip: run `feynman doctor` to see models.json path + load errors.");
}
}

src/model/models-json.ts Normal file
View File

@@ -0,0 +1,91 @@
import { chmodSync, existsSync, mkdirSync, readFileSync, writeFileSync } from "node:fs";
import { dirname } from "node:path";
type ModelsJson = {
providers?: Record<string, Record<string, unknown>>;
};
function readModelsJson(modelsJsonPath: string): { ok: true; value: ModelsJson } | { ok: false; error: string } {
if (!existsSync(modelsJsonPath)) {
return { ok: true, value: { providers: {} } };
}
try {
const raw = readFileSync(modelsJsonPath, "utf8").trim();
if (!raw) {
return { ok: true, value: { providers: {} } };
}
const parsed = JSON.parse(raw) as unknown;
if (!parsed || typeof parsed !== "object") {
return { ok: false, error: `Invalid models.json (expected an object): ${modelsJsonPath}` };
}
return { ok: true, value: parsed as ModelsJson };
} catch (error) {
return {
ok: false,
error: `Failed to read models.json: ${error instanceof Error ? error.message : String(error)}`,
};
}
}
export function upsertProviderBaseUrl(
modelsJsonPath: string,
providerId: string,
baseUrl: string,
): { ok: true } | { ok: false; error: string } {
return upsertProviderConfig(modelsJsonPath, providerId, { baseUrl });
}
export type ProviderConfigPatch = {
baseUrl?: string;
apiKey?: string;
api?: string;
authHeader?: boolean;
headers?: Record<string, string>;
models?: Array<{ id: string }>;
};
export function upsertProviderConfig(
modelsJsonPath: string,
providerId: string,
patch: ProviderConfigPatch,
): { ok: true } | { ok: false; error: string } {
const loaded = readModelsJson(modelsJsonPath);
if (!loaded.ok) {
return loaded;
}
const value: ModelsJson = loaded.value;
const providers: Record<string, Record<string, unknown>> = {
...(value.providers && typeof value.providers === "object" ? value.providers : {}),
};
const currentProvider =
providers[providerId] && typeof providers[providerId] === "object" ? providers[providerId] : {};
const nextProvider: Record<string, unknown> = { ...currentProvider };
if (patch.baseUrl !== undefined) nextProvider.baseUrl = patch.baseUrl;
if (patch.apiKey !== undefined) nextProvider.apiKey = patch.apiKey;
if (patch.api !== undefined) nextProvider.api = patch.api;
if (patch.authHeader !== undefined) nextProvider.authHeader = patch.authHeader;
if (patch.headers !== undefined) nextProvider.headers = patch.headers;
if (patch.models !== undefined) nextProvider.models = patch.models;
providers[providerId] = nextProvider;
const next: ModelsJson = { ...value, providers };
try {
mkdirSync(dirname(modelsJsonPath), { recursive: true });
writeFileSync(modelsJsonPath, JSON.stringify(next, null, 2) + "\n", "utf8");
// models.json can contain API keys/headers; default to user-only permissions.
try {
chmodSync(modelsJsonPath, 0o600);
} catch {
// ignore permission errors (best-effort)
}
return { ok: true };
} catch (error) {
return { ok: false, error: `Failed to write models.json: ${error instanceof Error ? error.message : String(error)}` };
}
}

src/model/registry.ts Normal file
View File

@@ -0,0 +1,12 @@
import { dirname, resolve } from "node:path";
import { AuthStorage, ModelRegistry } from "@mariozechner/pi-coding-agent";
export function getModelsJsonPath(authPath: string): string {
return resolve(dirname(authPath), "models.json");
}
export function createModelRegistry(authPath: string): ModelRegistry {
return new ModelRegistry(AuthStorage.create(authPath), getModelsJsonPath(authPath));
}
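As a standalone illustration of the helper above (paths are hypothetical), models.json is always resolved as a sibling of auth.json, so both live in the same agent directory:

```typescript
import { dirname, resolve } from "node:path";

// Sketch of getModelsJsonPath: models.json sits next to auth.json.
function modelsJsonPathFor(authPath: string): string {
  return resolve(dirname(authPath), "models.json");
}

console.log(modelsJsonPathFor("/home/user/.feynman/agent/auth.json"));
// → /home/user/.feynman/agent/models.json
```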

View File

@@ -2,13 +2,19 @@ import { spawn } from "node:child_process";
import { existsSync } from "node:fs";
import { buildPiArgs, buildPiEnv, type PiRuntimeOptions, resolvePiPaths } from "./runtime.js";
import { ensureSupportedNodeVersion } from "../system/node-version.js";
export async function launchPiChat(options: PiRuntimeOptions): Promise<void> {
const { piCliPath, promisePolyfillPath } = resolvePiPaths(options.appRoot);
ensureSupportedNodeVersion();
const { piCliPath, promisePolyfillPath, promisePolyfillSourcePath, tsxLoaderPath } = resolvePiPaths(options.appRoot);
if (!existsSync(piCliPath)) {
throw new Error(`Pi CLI not found: ${piCliPath}`);
}
if (!existsSync(promisePolyfillPath)) {
const useBuiltPolyfill = existsSync(promisePolyfillPath);
const useDevPolyfill = !useBuiltPolyfill && existsSync(promisePolyfillSourcePath) && existsSync(tsxLoaderPath);
if (!useBuiltPolyfill && !useDevPolyfill) {
throw new Error(`Promise polyfill not found: ${promisePolyfillPath}`);
}
@@ -16,7 +22,11 @@ export async function launchPiChat(options: PiRuntimeOptions): Promise<void> {
process.stdout.write("\x1b[2J\x1b[3J\x1b[H");
}
const child = spawn(process.execPath, ["--import", promisePolyfillPath, piCliPath, ...buildPiArgs(options)], {
const importArgs = useDevPolyfill
? ["--import", tsxLoaderPath, "--import", promisePolyfillSourcePath]
: ["--import", promisePolyfillPath];
const child = spawn(process.execPath, [...importArgs, piCliPath, ...buildPiArgs(options)], {
cwd: options.workingDir,
stdio: "inherit",
env: buildPiEnv(options),
@@ -26,7 +36,11 @@ export async function launchPiChat(options: PiRuntimeOptions): Promise<void> {
child.on("error", reject);
child.on("exit", (code, signal) => {
if (signal) {
process.kill(process.pid, signal);
try {
process.kill(process.pid, signal);
} catch {
process.exitCode = 1;
}
return;
}
process.exitCode = code ?? 0;

View File

@@ -1,6 +1,7 @@
import type { PackageSource } from "@mariozechner/pi-coding-agent";
export const CORE_PACKAGE_SOURCES = [
"npm:@companion-ai/alpha-hub",
"npm:pi-subagents",
"npm:pi-btw",
"npm:pi-docparser",

View File

@@ -1,5 +1,5 @@
import { existsSync, readFileSync } from "node:fs";
import { dirname, resolve } from "node:path";
import { delimiter, dirname, resolve } from "node:path";
import {
BROWSER_FALLBACK_PATHS,
@@ -25,6 +25,8 @@ export function resolvePiPaths(appRoot: string) {
piPackageRoot: resolve(appRoot, "node_modules", "@mariozechner", "pi-coding-agent"),
piCliPath: resolve(appRoot, "node_modules", "@mariozechner", "pi-coding-agent", "dist", "cli.js"),
promisePolyfillPath: resolve(appRoot, "dist", "system", "promise-polyfill.js"),
promisePolyfillSourcePath: resolve(appRoot, "src", "system", "promise-polyfill.ts"),
tsxLoaderPath: resolve(appRoot, "node_modules", "tsx", "dist", "loader.mjs"),
researchToolsPath: resolve(appRoot, "extensions", "research-tools.ts"),
promptTemplatePath: resolve(appRoot, "prompts"),
systemPromptPath: resolve(appRoot, ".feynman", "SYSTEM.md"),
@@ -38,7 +40,11 @@ export function validatePiInstallation(appRoot: string): string[] {
const missing: string[] = [];
if (!existsSync(paths.piCliPath)) missing.push(paths.piCliPath);
if (!existsSync(paths.promisePolyfillPath)) missing.push(paths.promisePolyfillPath);
if (!existsSync(paths.promisePolyfillPath)) {
// Dev fallback: allow running from source without `dist/` build artifacts.
const hasDevPolyfill = existsSync(paths.promisePolyfillSourcePath) && existsSync(paths.tsxLoaderPath);
if (!hasDevPolyfill) missing.push(paths.promisePolyfillPath);
}
if (!existsSync(paths.researchToolsPath)) missing.push(paths.researchToolsPath);
if (!existsSync(paths.promptTemplatePath)) missing.push(paths.promptTemplatePath);
@@ -77,23 +83,35 @@ export function buildPiArgs(options: PiRuntimeOptions): string[] {
export function buildPiEnv(options: PiRuntimeOptions): NodeJS.ProcessEnv {
const paths = resolvePiPaths(options.appRoot);
const feynmanHome = dirname(options.feynmanAgentDir);
const feynmanNpmPrefixPath = resolve(feynmanHome, "npm-global");
const feynmanNpmBinPath = resolve(feynmanNpmPrefixPath, "bin");
const currentPath = process.env.PATH ?? "";
const binPath = paths.nodeModulesBinPath;
const binEntries = [paths.nodeModulesBinPath, resolve(paths.piWorkspaceNodeModulesPath, ".bin"), feynmanNpmBinPath];
const binPath = binEntries.join(delimiter);
return {
...process.env,
PATH: `${binPath}:${currentPath}`,
PATH: `${binPath}${delimiter}${currentPath}`,
FEYNMAN_VERSION: options.feynmanVersion,
FEYNMAN_SESSION_DIR: options.sessionDir,
FEYNMAN_MEMORY_DIR: resolve(dirname(options.feynmanAgentDir), "memory"),
FEYNMAN_NODE_EXECUTABLE: process.execPath,
FEYNMAN_BIN_PATH: resolve(options.appRoot, "bin", "feynman.js"),
FEYNMAN_NPM_PREFIX: feynmanNpmPrefixPath,
// Ensure the Pi child process uses Feynman's agent dir for auth/models/settings.
PI_CODING_AGENT_DIR: options.feynmanAgentDir,
PANDOC_PATH: process.env.PANDOC_PATH ?? resolveExecutable("pandoc", PANDOC_FALLBACK_PATHS),
PI_HARDWARE_CURSOR: process.env.PI_HARDWARE_CURSOR ?? "1",
PI_SKIP_VERSION_CHECK: process.env.PI_SKIP_VERSION_CHECK ?? "1",
MERMAID_CLI_PATH: process.env.MERMAID_CLI_PATH ?? resolveExecutable("mmdc", MERMAID_FALLBACK_PATHS),
PUPPETEER_EXECUTABLE_PATH:
process.env.PUPPETEER_EXECUTABLE_PATH ?? resolveExecutable("google-chrome", BROWSER_FALLBACK_PATHS),
// Always pin npm's global prefix to the Feynman workspace. npm injects
// lowercase config vars into child processes, which would otherwise leak
// the caller's global prefix into Pi.
NPM_CONFIG_PREFIX: feynmanNpmPrefixPath,
npm_config_prefix: feynmanNpmPrefixPath,
};
}
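The PATH change in this hunk replaces a hard-coded `:` with `node:path`'s `delimiter` so the prepended bin entries also compose correctly on Windows (`;`). A minimal sketch with illustrative directories:

```typescript
import { delimiter } from "node:path";

// Join bin directories with the platform's PATH separator (":" on POSIX,
// ";" on Windows) before prepending them to the inherited PATH.
const binEntries = ["/repo/node_modules/.bin", "/home/user/.feynman/npm-global/bin"];
const path = `${binEntries.join(delimiter)}${delimiter}${process.env.PATH ?? ""}`;
console.log(path.split(delimiter).slice(0, 2));
```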

View File

@@ -1,9 +1,10 @@
import { existsSync, mkdirSync, readFileSync, writeFileSync } from "node:fs";
import { dirname } from "node:path";
import { AuthStorage, ModelRegistry, type PackageSource } from "@mariozechner/pi-coding-agent";
import { ModelRegistry, type PackageSource } from "@mariozechner/pi-coding-agent";
import { CORE_PACKAGE_SOURCES, shouldPruneLegacyDefaultPackages } from "./package-presets.js";
import { createModelRegistry } from "../model/registry.js";
export type ThinkingLevel = "off" | "minimal" | "low" | "medium" | "high" | "xhigh";
@@ -115,8 +116,7 @@ export function normalizeFeynmanSettings(
settings.packages = [...CORE_PACKAGE_SOURCES];
}
const authStorage = AuthStorage.create(authPath);
const modelRegistry = new ModelRegistry(authStorage);
const modelRegistry = createModelRegistry(authPath);
const availableModels = modelRegistry.getAvailable().map((model) => ({
provider: model.provider,
id: model.id,

View File

@@ -1,6 +1,7 @@
import { AuthStorage, ModelRegistry } from "@mariozechner/pi-coding-agent";
import { getUserName as getAlphaUserName, isLoggedIn as isAlphaLoggedIn } from "@companion-ai/alpha-hub/lib";
import { readFileSync } from "node:fs";
import { formatPiWebAccessDoctorLines, getPiWebAccessStatus } from "../pi/web-access.js";
import { BROWSER_FALLBACK_PATHS, PANDOC_FALLBACK_PATHS, resolveExecutable } from "../system/executables.js";
import { readJson } from "../pi/settings.js";
@@ -8,6 +9,30 @@ import { validatePiInstallation } from "../pi/runtime.js";
import { printInfo, printPanel, printSection } from "../ui/terminal.js";
import { getCurrentModelSpec } from "../model/commands.js";
import { buildModelStatusSnapshotFromRecords, getAvailableModelRecords, getSupportedModelRecords } from "../model/catalog.js";
import { createModelRegistry, getModelsJsonPath } from "../model/registry.js";
function findProvidersMissingApiKey(modelsJsonPath: string): string[] {
try {
const raw = readFileSync(modelsJsonPath, "utf8").trim();
if (!raw) return [];
const parsed = JSON.parse(raw) as any;
const providers = parsed?.providers;
if (!providers || typeof providers !== "object") return [];
const missing: string[] = [];
for (const [providerId, config] of Object.entries(providers as Record<string, unknown>)) {
if (!config || typeof config !== "object") continue;
const models = (config as any).models;
if (!Array.isArray(models) || models.length === 0) continue;
const apiKey = (config as any).apiKey;
if (typeof apiKey !== "string" || apiKey.trim().length === 0) {
missing.push(providerId);
}
}
return missing;
} catch {
return [];
}
}
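A runnable sketch of the same missing-apiKey rule, applied to an inline config (provider names and keys below are made up for illustration):

```typescript
// A provider that declares a models[] list but has no non-empty apiKey
// string gets flagged; providers without models[] are skipped entirely.
const providers: Record<string, { apiKey?: string; models?: Array<{ id: string }> }> = {
  ollama: { models: [{ id: "llama3.1:8b" }] },                  // missing apiKey -> flagged
  openai: { apiKey: "sk-example", models: [{ id: "gpt-4o" }] }, // ok
  oauthOnly: {},                                                // no models[] -> skipped
};
const missing = Object.entries(providers)
  .filter(([, c]) => Array.isArray(c.models) && c.models.length > 0)
  .filter(([, c]) => typeof c.apiKey !== "string" || c.apiKey.trim().length === 0)
  .map(([id]) => id);
console.log(missing); // → [ 'ollama' ]
```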
export type DoctorOptions = {
settingsPath: string;
@@ -104,7 +129,7 @@ export function runStatus(options: DoctorOptions): void {
export function runDoctor(options: DoctorOptions): void {
const settings = readJson(options.settingsPath);
const modelRegistry = new ModelRegistry(AuthStorage.create(options.authPath));
const modelRegistry = createModelRegistry(options.authPath);
const availableModels = modelRegistry.getAvailable();
const pandocPath = resolveExecutable("pandoc", PANDOC_FALLBACK_PATHS);
const browserPath = process.env.PUPPETEER_EXECUTABLE_PATH ?? resolveExecutable("google-chrome", BROWSER_FALLBACK_PATHS);
@@ -144,6 +169,21 @@ export function runDoctor(options: DoctorOptions): void {
if (modelStatus.recommendedModelReason) {
console.log(` why: ${modelStatus.recommendedModelReason}`);
}
const modelsError = modelRegistry.getError();
if (modelsError) {
console.log("models.json: error");
for (const line of modelsError.split("\n")) {
console.log(` ${line}`);
}
} else {
const modelsJsonPath = getModelsJsonPath(options.authPath);
console.log(`models.json: ${modelsJsonPath}`);
const missingApiKeyProviders = findProvidersMissingApiKey(modelsJsonPath);
if (missingApiKeyProviders.length > 0) {
console.log(` warning: provider(s) missing apiKey: ${missingApiKeyProviders.join(", ")}`);
console.log(" note: custom providers with a models[] list need apiKey in models.json to be available.");
}
}
console.log(`pandoc: ${pandocPath ?? "missing"}`);
console.log(`browser preview runtime: ${browserPath ?? "missing"}`);
for (const line of formatPiWebAccessDoctorLines()) {

View File

@@ -13,13 +13,35 @@ export function setupPreviewDependencies(): PreviewSetupResult {
return { status: "ready", message: `pandoc already installed at ${pandocPath}` };
}
const brewPath = resolveExecutable("brew", BREW_FALLBACK_PATHS);
if (process.platform === "darwin" && brewPath) {
const result = spawnSync(brewPath, ["install", "pandoc"], { stdio: "inherit" });
if (result.status !== 0) {
throw new Error("Failed to install pandoc via Homebrew.");
if (process.platform === "darwin") {
const brewPath = resolveExecutable("brew", BREW_FALLBACK_PATHS);
if (brewPath) {
const result = spawnSync(brewPath, ["install", "pandoc"], { stdio: "inherit" });
if (result.status !== 0) {
throw new Error("Failed to install pandoc via Homebrew.");
}
return { status: "installed", message: "Preview dependency installed: pandoc" };
}
}
if (process.platform === "win32") {
const wingetPath = resolveExecutable("winget");
if (wingetPath) {
const result = spawnSync(wingetPath, ["install", "--id", "JohnMacFarlane.Pandoc", "-e"], { stdio: "inherit" });
if (result.status === 0) {
return { status: "installed", message: "Preview dependency installed: pandoc (via winget)" };
}
}
}
if (process.platform === "linux") {
const aptPath = resolveExecutable("apt-get");
if (aptPath) {
const result = spawnSync(aptPath, ["install", "-y", "pandoc"], { stdio: "inherit" });
if (result.status === 0) {
return { status: "installed", message: "Preview dependency installed: pandoc (via apt)" };
}
}
return { status: "installed", message: "Preview dependency installed: pandoc" };
}
return {

View File

@@ -29,6 +29,7 @@ function printNonInteractiveSetupGuidance(): void {
printInfo("Non-interactive terminal. Use explicit commands:");
printInfo(" feynman model login <provider>");
printInfo(" feynman model set <provider/model>");
printInfo(" # or configure API keys via env vars/auth.json and rerun `feynman model list`");
printInfo(" feynman alpha login");
printInfo(" feynman doctor");
}

View File

@@ -1,27 +1,36 @@
import { spawnSync } from "node:child_process";
import { existsSync } from "node:fs";
export const PANDOC_FALLBACK_PATHS = [
"/opt/homebrew/bin/pandoc",
"/usr/local/bin/pandoc",
];
const isWindows = process.platform === "win32";
const programFiles = process.env.PROGRAMFILES ?? "C:\\Program Files";
const localAppData = process.env.LOCALAPPDATA ?? "";
export const BREW_FALLBACK_PATHS = [
"/opt/homebrew/bin/brew",
"/usr/local/bin/brew",
];
export const PANDOC_FALLBACK_PATHS = isWindows
? [`${programFiles}\\Pandoc\\pandoc.exe`]
: ["/opt/homebrew/bin/pandoc", "/usr/local/bin/pandoc"];
export const BROWSER_FALLBACK_PATHS = [
"/Applications/Google Chrome.app/Contents/MacOS/Google Chrome",
"/Applications/Chromium.app/Contents/MacOS/Chromium",
"/Applications/Brave Browser.app/Contents/MacOS/Brave Browser",
"/Applications/Microsoft Edge.app/Contents/MacOS/Microsoft Edge",
];
export const BREW_FALLBACK_PATHS = isWindows
? []
: ["/opt/homebrew/bin/brew", "/usr/local/bin/brew"];
export const MERMAID_FALLBACK_PATHS = [
"/opt/homebrew/bin/mmdc",
"/usr/local/bin/mmdc",
];
export const BROWSER_FALLBACK_PATHS = isWindows
? [
`${programFiles}\\Google\\Chrome\\Application\\chrome.exe`,
`${programFiles} (x86)\\Google\\Chrome\\Application\\chrome.exe`,
`${localAppData}\\Google\\Chrome\\Application\\chrome.exe`,
`${programFiles}\\Microsoft\\Edge\\Application\\msedge.exe`,
`${programFiles}\\BraveSoftware\\Brave-Browser\\Application\\brave.exe`,
]
: [
"/Applications/Google Chrome.app/Contents/MacOS/Google Chrome",
"/Applications/Chromium.app/Contents/MacOS/Chromium",
"/Applications/Brave Browser.app/Contents/MacOS/Brave Browser",
"/Applications/Microsoft Edge.app/Contents/MacOS/Microsoft Edge",
];
export const MERMAID_FALLBACK_PATHS = isWindows
? []
: ["/opt/homebrew/bin/mmdc", "/usr/local/bin/mmdc"];
export function resolveExecutable(name: string, fallbackPaths: string[] = []): string | undefined {
for (const candidate of fallbackPaths) {
@@ -30,13 +39,19 @@ export function resolveExecutable(name: string, fallbackPaths: string[] = []): s
}
}
const result = spawnSync("sh", ["-lc", `command -v ${name}`], {
encoding: "utf8",
stdio: ["ignore", "pipe", "ignore"],
});
const isWindows = process.platform === "win32";
const result = isWindows
? spawnSync("cmd", ["/c", `where ${name}`], {
encoding: "utf8",
stdio: ["ignore", "pipe", "ignore"],
})
: spawnSync("sh", ["-lc", `command -v ${name}`], {
encoding: "utf8",
stdio: ["ignore", "pipe", "ignore"],
});
if (result.status === 0) {
const resolved = result.stdout.trim();
const resolved = result.stdout.trim().split(/\r?\n/)[0];
if (resolved) {
return resolved;
}

View File

@@ -0,0 +1,45 @@
export const MIN_NODE_VERSION = "20.19.0";
type ParsedNodeVersion = {
major: number;
minor: number;
patch: number;
};
function parseNodeVersion(version: string): ParsedNodeVersion {
const [major = "0", minor = "0", patch = "0"] = version.replace(/^v/, "").split(".");
return {
major: Number.parseInt(major, 10) || 0,
minor: Number.parseInt(minor, 10) || 0,
patch: Number.parseInt(patch, 10) || 0,
};
}
function compareNodeVersions(left: ParsedNodeVersion, right: ParsedNodeVersion): number {
if (left.major !== right.major) return left.major - right.major;
if (left.minor !== right.minor) return left.minor - right.minor;
return left.patch - right.patch;
}
export function isSupportedNodeVersion(version = process.versions.node): boolean {
return compareNodeVersions(parseNodeVersion(version), parseNodeVersion(MIN_NODE_VERSION)) >= 0;
}
export function getUnsupportedNodeVersionLines(version = process.versions.node): string[] {
const isWindows = process.platform === "win32";
return [
`feynman requires Node.js ${MIN_NODE_VERSION} or later (detected ${version}).`,
isWindows
? "Install a newer Node.js from https://nodejs.org, or use the standalone installer:"
: "Switch to Node 20 with `nvm install 20 && nvm use 20`, or use the standalone installer:",
isWindows
? "irm https://feynman.is/install.ps1 | iex"
: "curl -fsSL https://feynman.is/install | bash",
];
}
export function ensureSupportedNodeVersion(version = process.versions.node): void {
if (!isSupportedNodeVersion(version)) {
throw new Error(getUnsupportedNodeVersionLines(version).join("\n"));
}
}
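The floor check above compares (major, minor, patch) tuples lexicographically; a condensed standalone sketch of the same logic:

```typescript
// Minimum supported version as a numeric tuple (20.19.0).
const MIN = [20, 19, 0];

// Parse "v20.19.0" / "20.18.1" into numbers and compare against the floor.
function meetsFloor(version: string): boolean {
  const parts = version.replace(/^v/, "").split(".").map((p) => Number.parseInt(p, 10) || 0);
  for (let i = 0; i < 3; i++) {
    const diff = (parts[i] ?? 0) - MIN[i];
    if (diff !== 0) return diff > 0;
  }
  return true; // exactly the floor counts as supported
}

console.log(meetsFloor("v20.19.0"), meetsFloor("20.18.1"), meetsFloor("21.0.0"));
// → true false true
```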

tests/models-json.test.ts Normal file
View File

@@ -0,0 +1,32 @@
import test from "node:test";
import assert from "node:assert/strict";
import { mkdtempSync, readFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";
import { upsertProviderConfig } from "../src/model/models-json.js";
test("upsertProviderConfig creates models.json and merges provider config", () => {
const dir = mkdtempSync(join(tmpdir(), "feynman-models-"));
const modelsPath = join(dir, "models.json");
const first = upsertProviderConfig(modelsPath, "custom", {
baseUrl: "http://localhost:11434/v1",
apiKey: "ollama",
api: "openai-completions",
authHeader: true,
models: [{ id: "llama3.1:8b" }],
});
assert.deepEqual(first, { ok: true });
const second = upsertProviderConfig(modelsPath, "custom", {
baseUrl: "http://localhost:9999/v1",
});
assert.deepEqual(second, { ok: true });
const parsed = JSON.parse(readFileSync(modelsPath, "utf8")) as any;
assert.equal(parsed.providers.custom.baseUrl, "http://localhost:9999/v1");
assert.equal(parsed.providers.custom.api, "openai-completions");
assert.equal(parsed.providers.custom.authHeader, true);
assert.deepEqual(parsed.providers.custom.models, [{ id: "llama3.1:8b" }]);
});

View File

@@ -0,0 +1,35 @@
import test from "node:test";
import assert from "node:assert/strict";
import {
MIN_NODE_VERSION,
ensureSupportedNodeVersion,
getUnsupportedNodeVersionLines,
isSupportedNodeVersion,
} from "../src/system/node-version.js";
test("isSupportedNodeVersion enforces the exact minimum floor", () => {
assert.equal(isSupportedNodeVersion("20.19.0"), true);
assert.equal(isSupportedNodeVersion("21.0.0"), true);
assert.equal(isSupportedNodeVersion("20.18.1"), false);
assert.equal(isSupportedNodeVersion("18.17.0"), false);
});
test("ensureSupportedNodeVersion throws a guided upgrade message", () => {
assert.throws(
() => ensureSupportedNodeVersion("18.17.0"),
(error: unknown) =>
error instanceof Error &&
error.message.includes(`Node.js ${MIN_NODE_VERSION}`) &&
error.message.includes("nvm install 20 && nvm use 20") &&
error.message.includes("https://feynman.is/install"),
);
});
test("unsupported version guidance reports the detected version", () => {
const lines = getUnsupportedNodeVersionLines("18.17.0");
assert.equal(lines[0], "feynman requires Node.js 20.19.0 or later (detected 18.17.0).");
assert.ok(lines.some((line) => line.includes("curl -fsSL https://feynman.is/install | bash")));
});

View File

@@ -30,6 +30,11 @@ test("buildPiArgs includes configured runtime paths and prompt", () => {
});
test("buildPiEnv wires Feynman paths into the Pi environment", () => {
const previousUppercasePrefix = process.env.NPM_CONFIG_PREFIX;
const previousLowercasePrefix = process.env.npm_config_prefix;
process.env.NPM_CONFIG_PREFIX = "/tmp/global-prefix";
process.env.npm_config_prefix = "/tmp/global-prefix-lower";
const env = buildPiEnv({
appRoot: "/repo/feynman",
workingDir: "/workspace",
@@ -38,9 +43,31 @@ test("buildPiEnv wires Feynman paths into the Pi environment", () => {
feynmanVersion: "0.1.5",
});
assert.equal(env.FEYNMAN_SESSION_DIR, "/sessions");
assert.equal(env.FEYNMAN_BIN_PATH, "/repo/feynman/bin/feynman.js");
assert.equal(env.FEYNMAN_MEMORY_DIR, "/home/.feynman/memory");
try {
assert.equal(env.FEYNMAN_SESSION_DIR, "/sessions");
assert.equal(env.FEYNMAN_BIN_PATH, "/repo/feynman/bin/feynman.js");
assert.equal(env.FEYNMAN_MEMORY_DIR, "/home/.feynman/memory");
assert.equal(env.FEYNMAN_NPM_PREFIX, "/home/.feynman/npm-global");
assert.equal(env.NPM_CONFIG_PREFIX, "/home/.feynman/npm-global");
assert.equal(env.npm_config_prefix, "/home/.feynman/npm-global");
assert.equal(env.PI_CODING_AGENT_DIR, "/home/.feynman/agent");
assert.ok(
env.PATH?.startsWith(
"/repo/feynman/node_modules/.bin:/repo/feynman/.feynman/npm/node_modules/.bin:/home/.feynman/npm-global/bin:",
),
);
} finally {
if (previousUppercasePrefix === undefined) {
delete process.env.NPM_CONFIG_PREFIX;
} else {
process.env.NPM_CONFIG_PREFIX = previousUppercasePrefix;
}
if (previousLowercasePrefix === undefined) {
delete process.env.npm_config_prefix;
} else {
process.env.npm_config_prefix = previousLowercasePrefix;
}
}
});
test("resolvePiPaths includes the Promise.withResolvers polyfill path", () => {

File diff suppressed because one or more lines are too long

View File

@@ -7,6 +7,9 @@
"": {
"name": "website",
"version": "0.0.1",
"engines": {
"node": ">=20.19.0"
},
"dependencies": {
"@astrojs/react": "^4.4.2",
"@fontsource-variable/ibm-plex-sans": "^5.2.8",

View File

@@ -3,6 +3,9 @@
"type": "module",
"version": "0.0.1",
"private": true,
"engines": {
"node": ">=20.19.0"
},
"scripts": {
"dev": "astro dev",
"build": "node ../scripts/sync-website-installers.mjs && astro build",

View File

@@ -2,7 +2,7 @@
set -eu
VERSION="${1:-edge}"
VERSION="${1:-latest}"
INSTALL_BIN_DIR="${FEYNMAN_INSTALL_BIN_DIR:-$HOME/.local/bin}"
INSTALL_APP_DIR="${FEYNMAN_INSTALL_APP_DIR:-$HOME/.local/share/feynman}"
SKIP_PATH_UPDATE="${FEYNMAN_INSTALL_SKIP_PATH_UPDATE:-0}"
@@ -54,12 +54,16 @@ run_with_spinner() {
normalize_version() {
case "$1" in
"" | edge)
printf 'edge\n'
"")
printf 'latest\n'
;;
latest | stable)
printf 'latest\n'
;;
edge)
echo "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." >&2
exit 1
;;
v*)
printf '%s\n' "${1#v}"
;;
@@ -160,39 +164,33 @@ require_command() {
fi
}
resolve_release_metadata() {
normalized_version="$(normalize_version "$VERSION")"
warn_command_conflict() {
expected_path="$INSTALL_BIN_DIR/feynman"
resolved_path="$(command -v feynman 2>/dev/null || true)"
if [ "$normalized_version" = "edge" ]; then
release_json="$(download_text "https://api.github.com/repos/getcompanion-ai/feynman/releases/tags/edge")"
asset_url=""
for candidate in $(printf '%s\n' "$release_json" | sed -n 's/.*"browser_download_url":[[:space:]]*"\([^"]*\)".*/\1/p'); do
case "$candidate" in
*/feynman-*-${asset_target}.${archive_extension})
asset_url="$candidate"
break
;;
esac
done
if [ -z "$asset_url" ]; then
echo "Failed to resolve the latest Feynman edge bundle." >&2
exit 1
fi
archive_name="${asset_url##*/}"
bundle_name="${archive_name%.$archive_extension}"
resolved_version="${bundle_name#feynman-}"
resolved_version="${resolved_version%-${asset_target}}"
printf '%s\n%s\n%s\n%s\n' "$resolved_version" "$bundle_name" "$archive_name" "$asset_url"
if [ -z "$resolved_path" ]; then
return
fi
if [ "$resolved_path" != "$expected_path" ]; then
step "Warning: current shell resolves feynman to $resolved_path"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
step "Or launch directly: $expected_path"
case "$resolved_path" in
*"/node_modules/@companion-ai/feynman/"* | *"/node_modules/.bin/feynman")
step "If that path is an old global npm install, remove it with: npm uninstall -g @companion-ai/feynman"
;;
esac
fi
}
resolve_release_metadata() {
normalized_version="$(normalize_version "$VERSION")"
if [ "$normalized_version" = "latest" ]; then
release_json="$(download_text "https://api.github.com/repos/getcompanion-ai/feynman/releases/latest")"
resolved_version="$(printf '%s\n' "$release_json" | sed -n 's/.*"tag_name":[[:space:]]*"v\([^"]*\)".*/\1/p' | head -n 1)"
release_page="$(download_text "https://github.com/getcompanion-ai/feynman/releases/latest")"
resolved_version="$(printf '%s\n' "$release_page" | sed -n 's@.*releases/tag/v\([0-9][^"<>[:space:]]*\).*@\1@p' | head -n 1)"
if [ -z "$resolved_version" ]; then
echo "Failed to resolve the latest Feynman release version." >&2
@@ -290,20 +288,22 @@ add_to_path
case "$path_action" in
added)
step "PATH updated for future shells in $path_profile"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && feynman"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
;;
configured)
step "PATH is already configured for future shells in $path_profile"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && feynman"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
;;
skipped)
step "PATH update skipped"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && feynman"
step "Run now: export PATH=\"$INSTALL_BIN_DIR:\$PATH\" && hash -r && feynman"
;;
*)
step "$INSTALL_BIN_DIR is already on PATH"
step "Run: feynman"
step "Run: hash -r && feynman"
;;
esac
warn_command_conflict
printf 'Feynman %s installed successfully.\n' "$resolved_version"


@@ -0,0 +1,204 @@
#!/bin/sh
set -eu
VERSION="latest"
SCOPE="${FEYNMAN_SKILLS_SCOPE:-user}"
TARGET_DIR="${FEYNMAN_SKILLS_DIR:-}"
step() {
printf '==> %s\n' "$1"
}
normalize_version() {
case "$1" in
"")
printf 'latest\n'
;;
latest | stable)
printf 'latest\n'
;;
edge)
echo "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." >&2
exit 1
;;
v*)
printf '%s\n' "${1#v}"
;;
*)
printf '%s\n' "$1"
;;
esac
}
download_file() {
url="$1"
output="$2"
if command -v curl >/dev/null 2>&1; then
if [ -t 2 ]; then
curl -fL --progress-bar "$url" -o "$output"
else
curl -fsSL "$url" -o "$output"
fi
return
fi
if command -v wget >/dev/null 2>&1; then
if [ -t 2 ]; then
wget --show-progress -O "$output" "$url"
else
wget -q -O "$output" "$url"
fi
return
fi
echo "curl or wget is required to install Feynman skills." >&2
exit 1
}
download_text() {
url="$1"
if command -v curl >/dev/null 2>&1; then
curl -fsSL "$url"
return
fi
if command -v wget >/dev/null 2>&1; then
wget -q -O - "$url"
return
fi
echo "curl or wget is required to install Feynman skills." >&2
exit 1
}
resolve_version() {
normalized_version="$(normalize_version "$VERSION")"
if [ "$normalized_version" = "latest" ]; then
release_page="$(download_text "https://github.com/getcompanion-ai/feynman/releases/latest")"
resolved_version="$(printf '%s\n' "$release_page" | sed -n 's@.*releases/tag/v\([0-9][^"<>[:space:]]*\).*@\1@p' | head -n 1)"
if [ -z "$resolved_version" ]; then
echo "Failed to resolve the latest Feynman release version." >&2
exit 1
fi
printf '%s\nv%s\n' "$resolved_version" "$resolved_version"
return
fi
printf '%s\nv%s\n' "$normalized_version" "$normalized_version"
}
resolve_target_dir() {
if [ -n "$TARGET_DIR" ]; then
printf '%s\n' "$TARGET_DIR"
return
fi
case "$SCOPE" in
repo)
printf '%s/.agents/skills/feynman\n' "$PWD"
;;
user)
codex_home="${CODEX_HOME:-$HOME/.codex}"
printf '%s/skills/feynman\n' "$codex_home"
;;
*)
echo "Unknown scope: $SCOPE (expected --user or --repo)" >&2
exit 1
;;
esac
}
while [ $# -gt 0 ]; do
case "$1" in
--repo)
SCOPE="repo"
;;
--user)
SCOPE="user"
;;
--dir)
if [ $# -lt 2 ]; then
echo "Usage: install-skills.sh [stable|latest|<version>] [--user|--repo] [--dir <path>]" >&2
exit 1
fi
TARGET_DIR="$2"
shift
;;
edge|stable|latest|v*|[0-9]*)
VERSION="$1"
;;
*)
echo "Unknown argument: $1" >&2
echo "Usage: install-skills.sh [stable|latest|<version>] [--user|--repo] [--dir <path>]" >&2
exit 1
;;
esac
shift
done
archive_metadata="$(resolve_version)"
resolved_version="$(printf '%s\n' "$archive_metadata" | sed -n '1p')"
git_ref="$(printf '%s\n' "$archive_metadata" | sed -n '2p')"
archive_url=""
case "$git_ref" in
main)
archive_url="https://github.com/getcompanion-ai/feynman/archive/refs/heads/main.tar.gz"
;;
v*)
archive_url="https://github.com/getcompanion-ai/feynman/archive/refs/tags/${git_ref}.tar.gz"
;;
esac
if [ -z "$archive_url" ]; then
echo "Could not resolve a download URL for ref: $git_ref" >&2
exit 1
fi
install_dir="$(resolve_target_dir)"
step "Installing Feynman skills ${resolved_version} (${SCOPE})"
tmp_dir="$(mktemp -d)"
cleanup() {
rm -rf "$tmp_dir"
}
trap cleanup EXIT INT TERM
archive_path="$tmp_dir/feynman-skills.tar.gz"
step "Downloading skills archive"
download_file "$archive_url" "$archive_path"
extract_dir="$tmp_dir/extract"
mkdir -p "$extract_dir"
step "Extracting skills"
tar -xzf "$archive_path" -C "$extract_dir"
source_root="$(find "$extract_dir" -mindepth 1 -maxdepth 1 -type d | head -n 1)"
if [ -z "$source_root" ] || [ ! -d "$source_root/skills" ]; then
echo "Could not find skills/ in downloaded archive." >&2
exit 1
fi
mkdir -p "$(dirname "$install_dir")"
rm -rf "$install_dir"
mkdir -p "$install_dir"
cp -R "$source_root/skills/." "$install_dir/"
step "Installed skills to $install_dir"
case "$SCOPE" in
repo)
step "Repo-local skills will be discovered automatically from .agents/skills"
;;
user)
step "User-level skills will be discovered from \$CODEX_HOME/skills"
;;
esac
printf 'Feynman skills %s installed successfully.\n' "$resolved_version"


@@ -0,0 +1,123 @@
param(
[string]$Version = "latest",
[ValidateSet("User", "Repo")]
[string]$Scope = "User",
[string]$TargetDir = ""
)
$ErrorActionPreference = "Stop"
function Normalize-Version {
param([string]$RequestedVersion)
if (-not $RequestedVersion) {
return "latest"
}
switch ($RequestedVersion.ToLowerInvariant()) {
"latest" { return "latest" }
"stable" { return "latest" }
"edge" { throw "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." }
default { return $RequestedVersion.TrimStart("v") }
}
}
function Resolve-LatestReleaseVersion {
$page = Invoke-WebRequest -Uri "https://github.com/getcompanion-ai/feynman/releases/latest"
$match = [regex]::Match($page.Content, 'releases/tag/v([0-9][^"''<>\s]*)')
if (-not $match.Success) {
throw "Failed to resolve the latest Feynman release version."
}
return $match.Groups[1].Value
}
function Resolve-VersionMetadata {
param([string]$RequestedVersion)
$normalizedVersion = Normalize-Version -RequestedVersion $RequestedVersion
if ($normalizedVersion -eq "latest") {
$resolvedVersion = Resolve-LatestReleaseVersion
} else {
$resolvedVersion = $normalizedVersion
}
return [PSCustomObject]@{
ResolvedVersion = $resolvedVersion
GitRef = "v$resolvedVersion"
DownloadUrl = "https://github.com/getcompanion-ai/feynman/archive/refs/tags/v$resolvedVersion.zip"
}
}
function Resolve-InstallDir {
param(
[string]$ResolvedScope,
[string]$ResolvedTargetDir
)
if ($ResolvedTargetDir) {
return $ResolvedTargetDir
}
if ($ResolvedScope -eq "Repo") {
return Join-Path (Get-Location) ".agents\skills\feynman"
}
$codexHome = if ($env:CODEX_HOME) { $env:CODEX_HOME } else { Join-Path $HOME ".codex" }
return Join-Path $codexHome "skills\feynman"
}
$metadata = Resolve-VersionMetadata -RequestedVersion $Version
$resolvedVersion = $metadata.ResolvedVersion
$downloadUrl = $metadata.DownloadUrl
$installDir = Resolve-InstallDir -ResolvedScope $Scope -ResolvedTargetDir $TargetDir
$tmpDir = Join-Path ([System.IO.Path]::GetTempPath()) ("feynman-skills-install-" + [System.Guid]::NewGuid().ToString("N"))
New-Item -ItemType Directory -Path $tmpDir | Out-Null
try {
$archivePath = Join-Path $tmpDir "feynman-skills.zip"
$extractDir = Join-Path $tmpDir "extract"
Write-Host "==> Downloading Feynman skills $resolvedVersion"
Invoke-WebRequest -Uri $downloadUrl -OutFile $archivePath
Write-Host "==> Extracting skills"
Expand-Archive -LiteralPath $archivePath -DestinationPath $extractDir -Force
$sourceRoot = Get-ChildItem -Path $extractDir -Directory | Select-Object -First 1
if (-not $sourceRoot) {
throw "Could not find extracted Feynman archive."
}
$skillsSource = Join-Path $sourceRoot.FullName "skills"
if (-not (Test-Path $skillsSource)) {
throw "Could not find skills/ in downloaded archive."
}
$installParent = Split-Path $installDir -Parent
if ($installParent) {
New-Item -ItemType Directory -Path $installParent -Force | Out-Null
}
if (Test-Path $installDir) {
Remove-Item -Recurse -Force $installDir
}
New-Item -ItemType Directory -Path $installDir -Force | Out-Null
Copy-Item -Path (Join-Path $skillsSource "*") -Destination $installDir -Recurse -Force
Write-Host "==> Installed skills to $installDir"
if ($Scope -eq "Repo") {
Write-Host "Repo-local skills will be discovered automatically from .agents/skills."
} else {
Write-Host "User-level skills will be discovered from `$CODEX_HOME/skills."
}
Write-Host "Feynman skills $resolvedVersion installed successfully."
} finally {
if (Test-Path $tmpDir) {
Remove-Item -Recurse -Force $tmpDir
}
}


@@ -1,5 +1,5 @@
param(
[string]$Version = "edge"
[string]$Version = "latest"
)
$ErrorActionPreference = "Stop"
@@ -8,17 +8,27 @@ function Normalize-Version {
param([string]$RequestedVersion)
if (-not $RequestedVersion) {
return "edge"
return "latest"
}
switch ($RequestedVersion.ToLowerInvariant()) {
"edge" { return "edge" }
"latest" { return "latest" }
"stable" { return "latest" }
"edge" { throw "The edge channel has been removed. Use the default installer for the latest tagged release or pass an exact version." }
default { return $RequestedVersion.TrimStart("v") }
}
}
function Resolve-LatestReleaseVersion {
$page = Invoke-WebRequest -Uri "https://github.com/getcompanion-ai/feynman/releases/latest"
$match = [regex]::Match($page.Content, 'releases/tag/v([0-9][^"''<>\s]*)')
if (-not $match.Success) {
throw "Failed to resolve the latest Feynman release version."
}
return $match.Groups[1].Value
}
function Resolve-ReleaseMetadata {
param(
[string]$RequestedVersion,
@@ -28,34 +38,8 @@ function Resolve-ReleaseMetadata {
$normalizedVersion = Normalize-Version -RequestedVersion $RequestedVersion
if ($normalizedVersion -eq "edge") {
$release = Invoke-RestMethod -Uri "https://api.github.com/repos/getcompanion-ai/feynman/releases/tags/edge"
$asset = $release.assets | Where-Object { $_.name -like "feynman-*-$AssetTarget.$BundleExtension" } | Select-Object -First 1
if (-not $asset) {
throw "Failed to resolve the latest Feynman edge bundle."
}
$archiveName = $asset.name
$suffix = ".$BundleExtension"
$bundleName = $archiveName.Substring(0, $archiveName.Length - $suffix.Length)
$resolvedVersion = $bundleName.Substring("feynman-".Length)
$resolvedVersion = $resolvedVersion.Substring(0, $resolvedVersion.Length - ("-$AssetTarget").Length)
return [PSCustomObject]@{
ResolvedVersion = $resolvedVersion
BundleName = $bundleName
ArchiveName = $archiveName
DownloadUrl = $asset.browser_download_url
}
}
if ($normalizedVersion -eq "latest") {
$release = Invoke-RestMethod -Uri "https://api.github.com/repos/getcompanion-ai/feynman/releases/latest"
if (-not $release.tag_name) {
throw "Failed to resolve the latest Feynman release version."
}
$resolvedVersion = $release.tag_name.TrimStart("v")
$resolvedVersion = Resolve-LatestReleaseVersion
} else {
$resolvedVersion = $normalizedVersion
}
@@ -73,12 +57,26 @@ function Resolve-ReleaseMetadata {
}
function Get-ArchSuffix {
$arch = [System.Runtime.InteropServices.RuntimeInformation]::OSArchitecture
switch ($arch.ToString()) {
"X64" { return "x64" }
"Arm64" { return "arm64" }
default { throw "Unsupported architecture: $arch" }
# Prefer PROCESSOR_ARCHITECTURE which is always available on Windows.
# RuntimeInformation::OSArchitecture requires .NET 4.7.1+ and may not
# be loaded in every Windows PowerShell 5.1 session.
$envArch = $env:PROCESSOR_ARCHITECTURE
if ($envArch) {
switch ($envArch) {
"AMD64" { return "x64" }
"ARM64" { return "arm64" }
}
}
try {
$arch = [System.Runtime.InteropServices.RuntimeInformation]::OSArchitecture
switch ($arch.ToString()) {
"X64" { return "x64" }
"Arm64" { return "arm64" }
}
} catch {}
throw "Unsupported architecture: $(if ($envArch) { $envArch } else { 'unknown' })"
}
$archSuffix = Get-ArchSuffix
@@ -134,7 +132,11 @@ Workarounds:
"@ | Set-Content -Path $shimPath -Encoding ASCII
$currentUserPath = [Environment]::GetEnvironmentVariable("Path", "User")
if (-not $currentUserPath.Split(';').Contains($installBinDir)) {
$alreadyOnPath = $false
if ($currentUserPath) {
$alreadyOnPath = $currentUserPath.Split(';') -contains $installBinDir
}
if (-not $alreadyOnPath) {
$updatedPath = if ([string]::IsNullOrWhiteSpace($currentUserPath)) {
$installBinDir
} else {
@@ -146,6 +148,16 @@ Workarounds:
Write-Host "$installBinDir is already on PATH."
}
$resolvedCommand = Get-Command feynman -ErrorAction SilentlyContinue
if ($resolvedCommand -and $resolvedCommand.Source -ne $shimPath) {
Write-Warning "Current shell resolves feynman to $($resolvedCommand.Source)"
Write-Host "Run in a new shell, or run: `$env:Path = '$installBinDir;' + `$env:Path"
Write-Host "Then run: feynman"
if ($resolvedCommand.Source -like "*node_modules*@companion-ai*feynman*") {
Write-Host "If that path is an old global npm install, remove it with: npm uninstall -g @companion-ai/feynman"
}
}
Write-Host "Feynman $resolvedVersion installed successfully."
} finally {
if (Test-Path $tmpDir) {


@@ -17,7 +17,7 @@ curl -fsSL https://feynman.is/install | bash
The installer detects your OS and architecture automatically. On macOS it supports both Intel and Apple Silicon. On Linux it supports x64 and arm64. The launcher is installed to `~/.local/bin`, the bundled runtime is unpacked into `~/.local/share/feynman`, and your `PATH` is updated when needed.
By default, the one-line installer tracks the rolling `edge` channel from `main`.
If you previously installed Feynman via `npm`, `pnpm`, or `bun` and still see local Node.js errors after a curl install, your shell is probably still resolving the older global binary first. Run `which -a feynman`, then `hash -r`, or launch the standalone shim directly with `~/.local/bin/feynman`.
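The OS and architecture detection described above can be sketched in a few lines of POSIX sh. This is illustrative only; the real installer's variable names and error handling may differ:

```shell
#!/bin/sh
# Map uname output to an asset target like "linux-x64" or "darwin-arm64".
# (Sketch of typical installer detection; not the script's actual code.)
os="$(uname -s)"
arch="$(uname -m)"
case "$os" in
  Darwin) platform="darwin" ;;
  Linux)  platform="linux" ;;
  *) echo "Unsupported OS: $os" >&2; exit 1 ;;
esac
case "$arch" in
  x86_64 | amd64)  asset_arch="x64" ;;
  aarch64 | arm64) asset_arch="arm64" ;;
  *) echo "Unsupported architecture: $arch" >&2; exit 1 ;;
esac
asset_target="${platform}-${asset_arch}"
echo "$asset_target"
```

On an Apple Silicon Mac this prints `darwin-arm64`; on a typical Linux server, `linux-x64`.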
On **Windows**, open PowerShell as Administrator and run:
@@ -27,25 +27,53 @@ irm https://feynman.is/install.ps1 | iex
This installs the Windows runtime bundle under `%LOCALAPPDATA%\Programs\feynman`, adds its launcher to your user `PATH`, and lets you re-run the installer at any time to update.
## Stable or pinned releases
## Skills only
If you want the latest tagged release instead of the rolling `edge` channel:
If you only want Feynman's research skills and not the full terminal runtime, install the skill library separately.
For a user-level install into `~/.codex/skills/feynman`:
```bash
curl -fsSL https://feynman.is/install | bash -s -- stable
curl -fsSL https://feynman.is/install-skills | bash
```
For a repo-local install into `.agents/skills/feynman` under the current repository:
```bash
curl -fsSL https://feynman.is/install-skills | bash -s -- --repo
```
On Windows, install the skills into your Codex skill directory:
```powershell
irm https://feynman.is/install-skills.ps1 | iex
```
Or install them repo-locally:
```powershell
& ([scriptblock]::Create((irm https://feynman.is/install-skills.ps1))) -Scope Repo
```
These installers download only the `skills/` tree from the Feynman repository. They do not install the Feynman terminal, bundled Node runtime, auth storage, or Pi packages.
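Under the hood, each requested version resolves to a plain GitHub source-archive URL. A minimal sketch of that mapping, mirroring the case statement in the installer (the helper name here is illustrative):

```shell
#!/bin/sh
# Map a resolved git ref (always "v"-prefixed after normalization)
# to the tag tarball the skills installer downloads.
skills_archive_url() {
  ref="$1"
  case "$ref" in
    v*) printf 'https://github.com/getcompanion-ai/feynman/archive/refs/tags/%s.tar.gz\n' "$ref" ;;
    *)  echo "Could not resolve a download URL for ref: $ref" >&2; return 1 ;;
  esac
}

skills_archive_url "v0.2.15"
# -> https://github.com/getcompanion-ai/feynman/archive/refs/tags/v0.2.15.tar.gz
```

The installer then extracts only the `skills/` directory from that tarball into the target path.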
## Pinned releases
The one-line installer already targets the latest tagged release. To pin an exact version, pass it explicitly:
```bash
curl -fsSL https://feynman.is/install | bash -s -- 0.2.15
```
On Windows:
```powershell
& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version stable
& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.15
```
You can also pin an exact version by replacing `stable` with a version such as `0.2.14`.
## pnpm
If you already have Node.js 20.18.1+ installed, you can install Feynman globally via `pnpm`:
If you already have Node.js `20.19.0` or newer installed, you can install Feynman globally via `pnpm`:
```bash
pnpm add -g @companion-ai/feynman
@@ -59,6 +87,8 @@ pnpm dlx @companion-ai/feynman
## bun
`bun add -g` and `bunx` still use your local Node runtime for Feynman itself, so the same Node.js `20.19.0+` requirement applies.
```bash
bun add -g @companion-ai/feynman
```
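Before reaching for `pnpm` or `bun`, you can confirm your installed Node meets the documented floor. A small sketch using `sort -V` (available in GNU coreutils and BusyBox, though not strictly POSIX):

```shell
#!/bin/sh
# Compare an installed Node version against the documented 20.19.0 floor.
# sort -V orders version strings; if the floor sorts first (or ties), we pass.
meets_node_floor() {
  floor="20.19.0"
  current="${1#v}"   # tolerate a leading "v", as printed by `node -v`
  [ "$(printf '%s\n%s\n' "$floor" "$current" | sort -V | head -n 1)" = "$floor" ]
}

if meets_node_floor "$(node -v 2>/dev/null || echo v0)"; then
  echo "Node meets the 20.19.0 floor"
else
  echo "Node missing or older than 20.19.0" >&2
fi
```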
@@ -98,6 +128,7 @@ For contributing or running Feynman from source:
```bash
git clone https://github.com/getcompanion-ai/feynman.git
cd feynman
pnpm install
pnpm start
nvm use || nvm install
npm install
npm start
```


@@ -39,11 +39,11 @@ const {
</head>
<body class="flex min-h-screen flex-col bg-background text-foreground antialiased">
<nav class="sticky top-0 z-50 bg-background">
<div class="mx-auto flex h-14 max-w-6xl items-center justify-between px-6">
<div class="mx-auto flex h-14 max-w-6xl items-center justify-between gap-4 px-4 sm:px-6">
<a href="/" class="flex items-center gap-2">
<span class="font-['VT323'] text-2xl text-primary">feynman</span>
</a>
<div class="flex items-center gap-6">
<div class="flex items-center gap-4 sm:gap-6">
<a
href="/docs/getting-started/installation"
class:list={[
@@ -98,7 +98,7 @@ const {
</main>
<footer>
<div class="mx-auto flex max-w-6xl flex-col items-center justify-between gap-4 px-6 py-8 sm:flex-row">
<div class="mx-auto flex max-w-6xl flex-col items-center justify-between gap-4 px-4 py-8 sm:flex-row sm:px-6">
<p class="text-sm text-muted-foreground">
&copy; {new Date().getFullYear()} Companion, Inc.
</p>


@@ -64,12 +64,12 @@ const installCommands = [
</div>
<div class="flex w-full max-w-3xl flex-col items-center gap-4">
<div class="flex w-full flex-col">
<div class="flex self-start">
<div class="flex w-full max-w-full flex-col">
<div class="flex max-w-full self-start overflow-x-auto">
{installCommands.map((entry, i) => (
<button
class:list={[
"install-toggle px-4 py-2 text-sm font-medium transition-colors cursor-pointer",
"install-toggle shrink-0 px-4 py-2 text-sm font-medium transition-colors cursor-pointer",
i === 0 ? "rounded-tl-lg" : "",
i === installCommands.length - 1 ? "rounded-tr-lg" : "",
entry.label === installCommands[0].label
@@ -85,17 +85,17 @@ const installCommands = [
</div>
<button
id="install-cmd"
class="group flex w-full items-center justify-between gap-3 rounded-b-lg rounded-tr-lg bg-muted px-4 py-3 text-left font-mono text-sm transition-colors hover:bg-muted/80 cursor-pointer"
class="group flex w-full min-w-0 max-w-full items-start justify-between gap-3 rounded-b-lg rounded-tr-lg bg-muted px-4 py-3 text-left font-mono text-xs transition-colors hover:bg-muted/80 cursor-pointer sm:items-center sm:text-sm"
data-command={installCommands[0].command}
aria-label="Copy install command"
>
<span id="install-command" class="min-w-0 truncate">{installCommands[0].command}</span>
<svg id="install-copy" class="size-4 shrink-0 text-muted-foreground transition-colors group-hover:text-foreground" fill="none" viewBox="0 0 24 24" stroke="currentColor" stroke-width="2"><rect x="9" y="9" width="13" height="13" rx="2"/><path d="M5 15H4a2 2 0 0 1-2-2V4a2 2 0 0 1 2-2h9a2 2 0 0 1 2 2v1"/></svg>
<svg id="install-check" class="hidden size-4 shrink-0 text-primary" fill="none" viewBox="0 0 24 24" stroke="currentColor" stroke-width="2"><path d="M20 6L9 17l-5-5"/></svg>
<span id="install-command" class="flex-1 min-w-0 break-all whitespace-normal leading-relaxed sm:overflow-hidden sm:text-ellipsis sm:whitespace-nowrap">{installCommands[0].command}</span>
<svg id="install-copy" class="mt-0.5 size-4 shrink-0 text-muted-foreground transition-colors group-hover:text-foreground sm:mt-0" fill="none" viewBox="0 0 24 24" stroke="currentColor" stroke-width="2"><rect x="9" y="9" width="13" height="13" rx="2"/><path d="M5 15H4a2 2 0 0 1-2-2V4a2 2 0 0 1 2-2h9a2 2 0 0 1 2 2v1"/></svg>
<svg id="install-check" class="mt-0.5 hidden size-4 shrink-0 text-primary sm:mt-0" fill="none" viewBox="0 0 24 24" stroke="currentColor" stroke-width="2"><path d="M20 6L9 17l-5-5"/></svg>
</button>
</div>
<div class="flex items-center gap-3">
<div class="flex flex-wrap items-center justify-center gap-3">
<a href="/docs/getting-started/installation">
<Button client:load size="lg">Get Started</Button>
</a>
@@ -103,6 +103,10 @@ const installCommands = [
GitHub
</a>
</div>
<p class="text-sm text-muted-foreground">
Need just the skills? <a href="/docs/getting-started/installation" class="text-primary hover:underline">Install the skills-only bundle</a>.
</p>
</div>
<img src="/hero.png" class="w-full" alt="Feynman CLI" />
@@ -221,7 +225,7 @@ const installCommands = [
<p class="text-muted-foreground">
Built on <a href="https://github.com/badlogic/pi-mono" class="text-primary hover:underline">Pi</a> and <a href="https://www.alphaxiv.org/" class="text-primary hover:underline">alphaXiv</a>. Capabilities ship as Pi skills and every output stays source-grounded.
</p>
<div class="flex items-center gap-3">
<div class="flex flex-wrap items-center justify-center gap-3">
<a href="/docs/getting-started/installation">
<Button client:load size="lg">Get Started</Button>
</a>


@@ -123,10 +123,12 @@
}
body {
@apply bg-background text-foreground;
overflow-x: clip;
}
html {
@apply font-sans;
scroll-behavior: smooth;
overflow-x: clip;
}
}
@@ -366,4 +368,4 @@
::selection {
background: var(--primary);
color: var(--primary-foreground);
}
}