Compare commits

3 Commits

| Author | SHA1 | Date |
|---|---|---|
| | aed607ce62 | |
| | ab8a284c74 | |
| | 62d63be1d8 | |

CHANGELOG.md (18 lines changed)
@@ -77,3 +77,21 @@ Use this file to track chronology, not release notes. Keep entries short, factua
- Failed / learned: The open subagent issue is fixed on `main` but still user-visible on tagged installs until a fresh release is cut.
- Blockers: Need the GitHub publish workflow to finish successfully before the issue can be honestly closed as released.
- Next: Push `0.2.15`, monitor the publish workflow, then update and close the relevant GitHub issue/PR once the release is live.

### 2026-03-28 15:15 PDT — pi-subagents-agent-dir-compat

- Objective: Debug why tagged installs can still fail subagent/auth flows after `0.2.15` when users are not on Anthropic.
- Changed: Added `scripts/lib/pi-subagents-patch.mjs` plus type declarations, and wired `scripts/patch-embedded-pi.mjs` to rewrite vendored `pi-subagents` runtime files so they resolve user-scoped paths from `PI_CODING_AGENT_DIR` instead of the hardcoded `~/.pi/agent`; added `tests/pi-subagents-patch.test.ts`.
- Verified: Materialized `.feynman/npm`, inspected the shipped `pi-subagents@0.11.11` sources, confirmed the hardcoded `~/.pi/agent` paths in `index.ts`, `agents.ts`, `artifacts.ts`, `run-history.ts`, `skills.ts`, and `chain-clarify.ts`; ran `node scripts/patch-embedded-pi.mjs`; ran `npm test`, `npm run typecheck`, and `npm run build`.
- Failed / learned: The earlier `0.2.15` fix only proved that Feynman exported `PI_CODING_AGENT_DIR` to the top-level Pi child; it did not cover vendored extension code that still hardcoded `.pi` paths internally.
- Blockers: Users still need a release containing this patch before tagged installs benefit from it.
- Next: Cut the next release and verify a tagged install exercises subagents without reading from `~/.pi/agent`.

### 2026-03-28 21:46 PDT — release-0.2.16

- Objective: Ship the vendored `pi-subagents` agent-dir compatibility fix to tagged installs.
- Changed: Bumped the package version from `0.2.15` to `0.2.16` in `package.json` and `package-lock.json`; updated pinned installer examples in `README.md` and `website/src/content/docs/getting-started/installation.md`.
- Verified: Re-ran `npm test`, `npm run typecheck`, and `npm run build`; ran `cd website && npm run build`; ran `npm pack` and confirmed the `0.2.16` tarball includes the new `scripts/lib/pi-subagents-patch.*` files.
- Failed / learned: An initial local `build:native-bundle` check failed because `npm pack` and `build:native-bundle` were run in parallel, and `prepack` intentionally removes `dist/release`; rerunning `npm run build:native-bundle` sequentially succeeded.
- Blockers: None in the repo; publishing still depends on the GitHub workflow running on the bumped version.
- Next: Push the `0.2.16` release bump and monitor npm/GitHub release publication.
@@ -25,7 +25,7 @@ curl -fsSL https://feynman.is/install | bash
irm https://feynman.is/install.ps1 | iex
```

-The one-line installer fetches the latest tagged release. To pin a version, pass it explicitly, for example `curl -fsSL https://feynman.is/install | bash -s -- 0.2.15`.
+The one-line installer fetches the latest tagged release. To pin a version, pass it explicitly, for example `curl -fsSL https://feynman.is/install | bash -s -- 0.2.16`.

If you install via `pnpm` or `bun` instead of the standalone bundle, Feynman requires Node.js `20.19.0` or newer.
@@ -82,9 +82,6 @@ $ feynman audit 2401.12345
$ feynman replicate "chain-of-thought improves math"
→ Replicates experiments on local or cloud GPUs

$ feynman valichord "study-id-or-topic"
→ Runs the ValiChord reproducibility workflow or checks existing Harmony Records
```

---
@@ -100,7 +97,6 @@ Ask naturally or use slash commands as shortcuts.
| `/review <artifact>` | Simulated peer review with severity and revision plan |
| `/audit <item>` | Paper vs. codebase mismatch audit |
| `/replicate <paper>` | Replicate experiments on local or cloud GPUs |
| `/valichord <study-or-topic>` | Reproducibility attestation workflow and Harmony Record lookup |
| `/compare <topic>` | Source comparison matrix |
| `/draft <topic>` | Paper-style draft from research findings |
| `/autoresearch <idea>` | Autonomous experiment loop |
package-lock.json (4 lines changed, generated)
@@ -1,12 +1,12 @@
{
  "name": "@companion-ai/feynman",
-  "version": "0.2.15",
+  "version": "0.2.16",
  "lockfileVersion": 3,
  "requires": true,
  "packages": {
    "": {
      "name": "@companion-ai/feynman",
-      "version": "0.2.15",
+      "version": "0.2.16",
      "license": "MIT",
      "dependencies": {
        "@companion-ai/alpha-hub": "^0.1.2",
@@ -1,6 +1,6 @@
{
  "name": "@companion-ai/feynman",
-  "version": "0.2.15",
+  "version": "0.2.16",
  "description": "Research-first CLI agent built on Pi and alphaXiv",
  "license": "MIT",
  "type": "module",
@@ -1,266 +0,0 @@
---
description: Submit a replication as a cryptographically verified ValiChord attestation, discover studies awaiting independent validation, query Harmony Records and reproducibility badges, or assist researchers in preparing a study for the validation pipeline.
section: Research Workflows
topLevelCli: true
---
# ValiChord Validation Workflow

ValiChord is a distributed peer-to-peer system for scientific reproducibility verification, built on Holochain. It implements a blind commit-reveal protocol in Rust across four DNAs, producing Harmony Records — immutable, cryptographically verifiable proofs that independent parties reproduced the same findings without coordinating. Verified studies receive automatic reproducibility badges (Gold/Silver/Bronze); validators accumulate a per-discipline reputation score across rounds.

This workflow integrates Feynman at three levels: as a **validator agent** running the full commit-reveal protocol; as a **researcher's assistant** helping prepare a study for submission; and as a **query tool** surfacing reproducibility status during research.

**Live demo of the commit-reveal protocol**: https://youtu.be/DQ5wZSD1YEw

---

## ValiChord's four-DNA architecture

| DNA | Name | Type | Role |
|-----|------|------|------|
| 1 | Researcher Repository | Private, single-agent | Researcher's local archive. Stores the study, pre-registered protocol, data snapshots, and deviation declarations. Only SHA-256 hashes ever leave this DNA. |
| 2 | Validator Workspace | Private, single-agent | Feynman's working space. Stores the task privately. Seals the blind commitment here — content never propagates to the DHT. |
| 3 | Attestation | Shared DHT | Coordination layer. Manages validation requests, validator profiles, study claims, commitment anchors, phase markers, and public attestations. 36 zome functions. |
| 4 | Governance | Public DHT | Final record layer. Assembles Harmony Records, issues reproducibility badges, tracks validator reputation, records governance decisions. All read functions are accessible via HTTP Gateway without running a node. |

The key guarantee: a validator's findings are cryptographically sealed (`SHA-256(msgpack(attestation) || nonce)`) before the reveal phase opens, so neither party can adjust findings after seeing the other's results. The researcher runs a parallel commit-reveal — locking their expected results before the validators reveal — so no party can adapt to the other's outcome.
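The sealing step can be sketched in a few lines. This is an illustration only: `JSON.stringify` stands in for msgpack serialization, and the helper names are hypothetical, not ValiChord zome calls.

```javascript
// Sketch of the blind commit-reveal sealing step.
// Assumption: JSON serialization substitutes for msgpack for illustration.
import { createHash, randomBytes } from "node:crypto";

function sealAttestation(attestation) {
  const nonce = randomBytes(32);
  const payload = Buffer.concat([Buffer.from(JSON.stringify(attestation)), nonce]);
  const commitmentHash = createHash("sha256").update(payload).digest("hex");
  return { commitmentHash, nonce };
}

// At reveal time, the coordinator recomputes the hash over the revealed
// attestation and nonce, and rejects the reveal if it does not match.
function verifyReveal(attestation, nonce, commitmentHash) {
  const payload = Buffer.concat([Buffer.from(JSON.stringify(attestation)), nonce]);
  return createHash("sha256").update(payload).digest("hex") === commitmentHash;
}
```

Because the nonce is random and the hash is computed over the full attestation, a commitment reveals nothing about the findings yet binds the validator to them exactly.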

---

## Workflow A: Feynman as validator agent

### Step 0: Publish validator profile (one-time setup)

On first use, publish Feynman's public profile to DNA 3 so it appears in validator discovery indexes and conflict-of-interest checks:

```
publish_validator_profile(profile: ValidatorProfile)
```

Key fields:
- `agent_type` — `AutomatedTool` (AI agents are first-class validators; the protocol makes no distinction between human and machine validators)
- `disciplines` — list of disciplines Feynman can validate (e.g. ComputationalBiology, Statistics)
- `certification_tier` — starts as `Provisional`; advances to `Certified` after 5+ validations with ≥60% agreement rate, and to `Senior` after 20+ with ≥80%

If a profile already exists, use `update_validator_profile` to merge changes.
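The tier thresholds above reduce to a small cascade. A sketch based only on the two thresholds stated here; the function name and shape are hypothetical:

```javascript
// Hypothetical helper: maps a validator's track record to a certification tier
// using the stated thresholds (5+/60% for Certified, 20+/80% for Senior).
function certificationTier(totalValidations, agreementRate) {
  if (totalValidations >= 20 && agreementRate >= 0.8) return "Senior";
  if (totalValidations >= 5 && agreementRate >= 0.6) return "Certified";
  return "Provisional";
}
```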

### Step 1: Gather inputs or discover study

**If the user provides a `request_ref`**: use it directly.

**If Feynman is proactively discovering work**: query the pending queue in DNA 3:

```
get_pending_requests_for_discipline(discipline: Discipline)
```

Returns all unclaimed `ValidationRequest` entries for the discipline. Each contains:
- `data_hash` — the ExternalHash identifier (used as `request_ref` throughout)
- `num_validators_required` — the quorum needed to close the round
- `validation_tier` — Basic / Enhanced / Comprehensive
- `access_urls` — where to fetch the data and code

Optionally assess study complexity before committing:

```
assess_difficulty(input: AssessDifficultyInput)
```

Scores code volume, dependency count, documentation quality, data accessibility, and environment complexity. Returns a predicted duration and confidence. Use this to decide whether to proceed before claiming.
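To make the idea concrete, a difficulty assessment of this kind can be modeled as a weighted sum over the five factors. The weights, scale, and function name below are invented for illustration and are not ValiChord's actual scoring model:

```javascript
// Illustrative only: a weighted-sum difficulty score over the five factors the
// assessor considers. Weights are hypothetical; factors are normalized to [0, 1].
function difficultyScore(factors) {
  const weights = {
    codeVolume: 0.3,
    dependencyCount: 0.2,
    documentationQuality: 0.2, // better docs lower difficulty
    dataAccessibility: 0.15,   // easier access lowers difficulty
    environmentComplexity: 0.15,
  };
  let score = 0;
  score += weights.codeVolume * factors.codeVolume;
  score += weights.dependencyCount * factors.dependencyCount;
  score += weights.documentationQuality * (1 - factors.documentationQuality);
  score += weights.dataAccessibility * (1 - factors.dataAccessibility);
  score += weights.environmentComplexity * factors.environmentComplexity;
  return score; // 0 (trivial) to 1 (very hard)
}
```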

If replication results are not yet available, suggest `/replicate` first.

### Step 2: Claim the study

Before receiving a formal task assignment, register intent to validate via DNA 3:

```
claim_study(request_ref: ExternalHash)
```

This:
- Reserves a validator slot (enforced capacity: no over-subscription)
- Triggers a conflict-of-interest check — the claim is rejected if Feynman's institution matches the researcher's
- Records a `StudyClaim` entry on the shared DHT

If a claimed validator goes dark, any other validator can free the slot:

```
reclaim_abandoned_claim(input: ReclaimInput)
```

### Step 3: Receive task and seal private attestation — Commit phase

Connect to the ValiChord conductor via AppWebSocket. Using DNA 2 (Validator Workspace):

```
receive_task(request_ref, discipline, deadline_secs, validation_focus, time_cap_secs, compensation_tier)
```

`validation_focus` specifies which aspect Feynman is validating:
- `ComputationalReproducibility` — re-run code, check numerical outputs
- `PreCommitmentAdherence` — verify results match the pre-registered analysis plan
- `MethodologicalReview` — assess statistical choices and protocol validity

Then seal the private attestation — this is the blind commitment:

```
seal_private_attestation(task_hash, attestation)
```

Where `attestation` includes:
- `outcome` — `Reproduced` / `PartiallyReproduced` / `FailedToReproduce` / `UnableToAssess`
- `outcome_summary` — key metrics, effect direction, confidence-interval overlap, overall agreement
- `confidence` — High / Medium / Low
- `time_invested_secs` and `time_breakdown` — environment_setup, data_acquisition, code_execution, troubleshooting
- `computational_resources` — whether personal hardware, HPC, GPU, or cloud was required; estimated cost in pence
- `deviation_flags` — any undeclared departures from the original protocol (type, severity, evidence)

The coordinator computes `commitment_hash = SHA-256(msgpack(attestation) || nonce)` and writes a `CommitmentAnchor` to DNA 3's shared DHT. The attestation content remains private in DNA 2.

Save `task_hash` and `commitment_hash` to `outputs/<slug>-valichord-commit.json`.

### Step 4: Wait for RevealOpen phase

Poll DNA 3 (Attestation) until the phase transitions:

```
get_current_phase(request_ref: ExternalHash)
```

Returns `null` (still in the commit phase), `"RevealOpen"`, or `"Complete"`. Poll every 30 seconds. The phase opens automatically when the `CommitmentAnchor` count reaches `num_validators_required` — no manual trigger is required.
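A minimal polling loop over this call might look as follows. `getPhase` is a stand-in for an async wrapper around the `get_current_phase` zome call; the interval and attempt cap are assumptions, not protocol parameters:

```javascript
// Poll until the round leaves the commit phase. `getPhase` is a hypothetical
// async wrapper around the get_current_phase zome call; it returns
// null, "RevealOpen", or "Complete".
async function waitForRevealOpen(getPhase, { intervalMs = 30_000, maxAttempts = 1000 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt += 1) {
    const phase = await getPhase();
    if (phase === "RevealOpen" || phase === "Complete") return phase;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("timed out waiting for RevealOpen");
}
```

Injecting the zome call as a callback keeps the loop testable without a running conductor.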

During this wait, the researcher also runs their parallel commit-reveal: they lock their expected results via `publish_researcher_commitment` before the reveal phase opens, then reveal via `reveal_researcher_result` after all validators have submitted. No party — researcher or validator — can adapt to seeing the other's outcome.

### Step 5: Submit attestation — Reveal phase

When the phase is `RevealOpen`, publish the full attestation to the shared DHT via DNA 3:

```
submit_attestation(attestation, nonce)
```

The coordinator verifies `SHA-256(msgpack(attestation) || nonce) == CommitmentAnchor.commitment_hash` before writing. This prevents adaptive reveals — the attestation must match exactly what was committed.

### Step 6: Retrieve Harmony Record and badges

Call DNA 4 (Governance) explicitly after `submit_attestation` returns — DHT propagation means the ValidatorToAttestation link may not be visible within the same transaction:

```
check_and_create_harmony_record(request_ref)
get_harmony_record(request_ref)
get_badges_for_study(request_ref)
```

The **Harmony Record** contains:
- `outcome` — the majority reproduced/not-reproduced finding
- `agreement_level` — ExactMatch / WithinTolerance / DirectionalMatch / Divergent / UnableToAssess
- `participating_validators` — array of validator agent keys
- `validation_duration_secs`
- `ActionHash` — the immutable on-chain identifier

**Reproducibility badges** are issued automatically when the Harmony Record is created:

| Badge | Threshold |
|-------|-----------|
| GoldReproducible | ≥7 validators, ≥90% agreement |
| SilverReproducible | ≥5 validators, ≥70% agreement |
| BronzeReproducible | ≥3 validators, ≥50% agreement |
| FailedReproduction | Divergent outcomes |
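The badge thresholds in the table above reduce to a simple cascade. A sketch; the function name and the `null` below-quorum case are assumptions:

```javascript
// Hypothetical helper: assigns a reproducibility badge from validator count and
// agreement rate, per the thresholds in the table above.
function badgeFor(validators, agreementRate, divergent = false) {
  if (divergent) return "FailedReproduction";
  if (validators >= 7 && agreementRate >= 0.9) return "GoldReproducible";
  if (validators >= 5 && agreementRate >= 0.7) return "SilverReproducible";
  if (validators >= 3 && agreementRate >= 0.5) return "BronzeReproducible";
  return null; // below the minimum quorum for any badge (assumed behavior)
}
```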

Save the full record and badges to `outputs/<slug>-harmony-record.json`.

### Step 7: Check updated reputation

After each validation round, Feynman's reputation record in DNA 4 is updated:

```
get_validator_reputation(validator: AgentPubKey)
```

Returns per-discipline scores: total validations, agreement rate, average time, and the current `CertificationTier` (Provisional → Certified → Senior). Reputation is a long-term asset — AI validators accumulate a cryptographically verifiable track record across all ValiChord rounds they participate in.

### Step 8: Report to user

Present:
- Outcome and agreement level
- Reproducibility badge(s) issued to the study
- Feynman's updated reputation score for this discipline
- ActionHash — the permanent public identifier for this Harmony Record
- Confirmation that the record is written to the Governance DHT and accessible via HTTP Gateway without any special infrastructure
- Path to saved outputs

---

## Workflow B: Query existing Harmony Record

`get_harmony_record` and `get_badges_for_study` in DNA 4 are `Unrestricted` functions — accessible via Holochain's HTTP Gateway without connecting to a conductor or running a node.

```
GET <http_gateway_url>/get_harmony_record/<request_ref_b64>
GET <http_gateway_url>/get_badges_for_study/<request_ref_b64>
```
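Building the two gateway URLs can be sketched as below. Assumptions to note: the `_b64` path segment is taken to be URL-safe base64 of the raw `request_ref` bytes (the exact encoding is not specified here), and the gateway host is a placeholder:

```javascript
// Sketch: construct the two gateway URLs for a request_ref.
// Assumption: <request_ref_b64> is URL-safe base64 of the raw hash bytes.
function gatewayUrls(gatewayBase, requestRefBytes) {
  const refB64 = Buffer.from(requestRefBytes).toString("base64url");
  return {
    harmonyRecord: `${gatewayBase}/get_harmony_record/${refB64}`,
    badges: `${gatewayBase}/get_badges_for_study/${refB64}`,
  };
}
```

Because the functions are `Unrestricted`, these are plain GET requests — any HTTP client can issue them without Holochain tooling.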

Use this to:
- Check the reproducibility status of a cited study during `/deepresearch`
- Surface Harmony Records and badges in research summaries
- Verify whether a study has undergone independent validation before recommending it

The following read functions are also unrestricted on DNA 3:
`get_attestations_for_request`, `get_validators_for_discipline`, `get_pending_requests_for_discipline`, `get_validator_profile`, `get_current_phase`, `get_difficulty_assessment`, `get_researcher_reveal`

---

## Workflow C: Proactive discipline queue monitoring

Feynman can act as a standing validator for a discipline — periodically checking for new studies that need validation without waiting to be assigned:

```
get_pending_requests_for_discipline(discipline: Discipline)
```

Returns all unclaimed `ValidationRequest` entries. For each, optionally run `assess_difficulty` to estimate workload before claiming.

This enables Feynman to operate as an autonomous reproducibility agent: polling the queue, assessing difficulty, claiming appropriate studies, and running the full Workflow A cycle unsupervised.

---

## Workflow D: Researcher preparation assistant

Before a study enters the validation pipeline, Feynman can assist the researcher in preparing it via DNA 1 (Researcher Repository). This workflow runs on the researcher's side, not the validator's.

**Register the study:**
```
register_study(study: ResearchStudy)
```

**Pre-register the analysis protocol** (immutable once written — creates a tamper-evident commitment to the analysis plan before data collection or validation begins):
```
register_protocol(input: RegisterProtocolInput)
```

**Take a cryptographic data snapshot** (records a SHA-256 hash of the dataset at a point in time — proves the data was not modified after validation began):
```
take_data_snapshot(input: TakeDataSnapshotInput)
```

**Declare any deviations** from the pre-registered plan before the commit phase opens (pre-commit transparency):
```
declare_deviation(input: DeclareDeviationInput)
```

Only hashes ever leave DNA 1 — the raw data and protocol text remain on the researcher's device.
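The client-side half of a data snapshot is just local hashing: only the digest needs to reach `take_data_snapshot`. A sketch; the helper name is hypothetical and the zome input shape is not specified here:

```javascript
// Hash raw dataset bytes locally; only this digest ever leaves the
// researcher's device. Helper name is hypothetical.
import { createHash } from "node:crypto";

function datasetDigest(bytes) {
  return createHash("sha256").update(bytes).digest("hex");
}
```

Re-running the same function over the dataset later and comparing digests is enough to prove the data was unchanged since the snapshot.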

**Repository Readiness Checker**: ValiChord also ships a standalone audit tool that scans a research repository for 30+ reproducibility failure modes before submission — missing dependency files, absolute paths, undeclared environment requirements, data documentation gaps, human-subjects data exposure risks, and more. Feynman is the natural interface for this tool: running the audit, interpreting findings in plain language, guiding the researcher through fixes, and confirming the repository meets the bar for independent validation. See: https://github.com/topeuph-ai/ValiChord

---

## Notes

- AI agents are first-class participants in ValiChord's protocol. Feynman can autonomously publish profiles, claim studies, seal attestations, wait for phase transitions, and submit reveals — the protocol makes no distinction between human and AI validators.
- ValiChord's privacy guarantee is structural, not policy-based. DNA 1 (researcher data) and DNA 2 (validator workspace) are single-agent private DHTs — propagation to the shared network is architecturally impossible, not merely restricted.
- All 72 zome functions across the four DNAs are callable via AppWebSocket. The 20+ `Unrestricted` read functions on DNA 3 and DNA 4 are additionally accessible via HTTP Gateway without any Holochain node.
- If a validation round stalls due to validator dropout, `force_finalize_round` in DNA 4 closes it after a 7-day timeout with a reduced quorum, preventing indefinite blocking.
- Live demo (full commit-reveal cycle, Harmony Record generated): https://youtu.be/DQ5wZSD1YEw
- Running the demo: `bash demo/start.sh` in a GitHub Codespace, then open port 8888 publicly
- ValiChord repo: https://github.com/topeuph-ai/ValiChord
scripts/lib/pi-subagents-patch.d.mts (new file, 2 lines)
@@ -0,0 +1,2 @@
export const PI_SUBAGENTS_PATCH_TARGETS: string[];
export function patchPiSubagentsSource(relativePath: string, source: string): string;
scripts/lib/pi-subagents-patch.mjs (new file, 124 lines)
@@ -0,0 +1,124 @@
// Vendored pi-subagents source files that hardcode ~/.pi/agent paths.
export const PI_SUBAGENTS_PATCH_TARGETS = [
  "index.ts",
  "agents.ts",
  "artifacts.ts",
  "run-history.ts",
  "skills.ts",
  "chain-clarify.ts",
];

// Helper injected into each patched file: resolves the agent directory from
// PI_CODING_AGENT_DIR, falling back to the original ~/.pi/agent default.
const RESOLVE_PI_AGENT_DIR_HELPER = [
  "function resolvePiAgentDir(): string {",
  '  const configured = process.env.PI_CODING_AGENT_DIR?.trim();',
  '  if (!configured) return path.join(os.homedir(), ".pi", "agent");',
  '  return configured.startsWith("~/") ? path.join(os.homedir(), configured.slice(2)) : configured;',
  "}",
].join("\n");

function injectResolvePiAgentDirHelper(source) {
  // Already patched: makes the whole transform idempotent.
  if (source.includes("function resolvePiAgentDir(): string {")) {
    return source;
  }

  const lines = source.split("\n");
  let insertAt = 0;
  let importSeen = false;
  let importOpen = false;

  // Walk past any leading comment block and the import section (including
  // multi-line imports) so the helper lands immediately after the imports.
  for (let index = 0; index < lines.length; index += 1) {
    const trimmed = lines[index].trim();
    if (!importSeen) {
      if (trimmed === "" || trimmed.startsWith("/**") || trimmed.startsWith("*") || trimmed.startsWith("*/")) {
        insertAt = index + 1;
        continue;
      }
      if (trimmed.startsWith("import ")) {
        importSeen = true;
        importOpen = !trimmed.endsWith(";");
        insertAt = index + 1;
        continue;
      }
      break;
    }
    if (trimmed.startsWith("import ")) {
      importOpen = !trimmed.endsWith(";");
      insertAt = index + 1;
      continue;
    }
    if (importOpen) {
      if (trimmed.endsWith(";")) importOpen = false;
      insertAt = index + 1;
      continue;
    }
    if (trimmed === "") {
      insertAt = index + 1;
      continue;
    }
    insertAt = index;
    break;
  }

  return [...lines.slice(0, insertAt), "", RESOLVE_PI_AGENT_DIR_HELPER, "", ...lines.slice(insertAt)].join("\n");
}

function replaceAll(source, from, to) {
  return source.split(from).join(to);
}

export function patchPiSubagentsSource(relativePath, source) {
  let patched = source;

  switch (relativePath) {
    case "index.ts":
      patched = replaceAll(
        patched,
        'const configPath = path.join(os.homedir(), ".pi", "agent", "extensions", "subagent", "config.json");',
        'const configPath = path.join(resolvePiAgentDir(), "extensions", "subagent", "config.json");',
      );
      break;
    case "agents.ts":
      patched = replaceAll(
        patched,
        'const userDir = path.join(os.homedir(), ".pi", "agent", "agents");',
        'const userDir = path.join(resolvePiAgentDir(), "agents");',
      );
      break;
    case "artifacts.ts":
      patched = replaceAll(
        patched,
        'const sessionsBase = path.join(os.homedir(), ".pi", "agent", "sessions");',
        'const sessionsBase = path.join(resolvePiAgentDir(), "sessions");',
      );
      break;
    case "run-history.ts":
      patched = replaceAll(
        patched,
        'const HISTORY_PATH = path.join(os.homedir(), ".pi", "agent", "run-history.jsonl");',
        'const HISTORY_PATH = path.join(resolvePiAgentDir(), "run-history.jsonl");',
      );
      break;
    case "skills.ts":
      patched = replaceAll(
        patched,
        'const AGENT_DIR = path.join(os.homedir(), ".pi", "agent");',
        "const AGENT_DIR = resolvePiAgentDir();",
      );
      break;
    case "chain-clarify.ts":
      patched = replaceAll(
        patched,
        'const dir = path.join(os.homedir(), ".pi", "agent", "agents");',
        'const dir = path.join(resolvePiAgentDir(), "agents");',
      );
      break;
    default:
      return source;
  }

  if (patched === source) {
    return source;
  }

  return injectResolvePiAgentDirHelper(patched);
}
@@ -4,6 +4,7 @@ import { createRequire } from "node:module";
import { dirname, resolve } from "node:path";
import { fileURLToPath } from "node:url";
import { FEYNMAN_LOGO_HTML } from "../logo.mjs";
+import { PI_SUBAGENTS_PATCH_TARGETS, patchPiSubagentsSource } from "./lib/pi-subagents-patch.mjs";

const here = dirname(fileURLToPath(import.meta.url));
const appRoot = resolve(here, "..");

@@ -54,6 +55,7 @@ const interactiveThemePath = piPackageRoot ? resolve(piPackageRoot, "dist", "mod
const terminalPath = piTuiRoot ? resolve(piTuiRoot, "dist", "terminal.js") : null;
const editorPath = piTuiRoot ? resolve(piTuiRoot, "dist", "components", "editor.js") : null;
const workspaceRoot = resolve(appRoot, ".feynman", "npm", "node_modules");
+const piSubagentsRoot = resolve(workspaceRoot, "pi-subagents");
const webAccessPath = resolve(workspaceRoot, "pi-web-access", "index.ts");
const sessionSearchIndexerPath = resolve(
  workspaceRoot,

@@ -243,6 +245,19 @@ function ensurePandoc() {

ensurePandoc();

+if (existsSync(piSubagentsRoot)) {
+  for (const relativePath of PI_SUBAGENTS_PATCH_TARGETS) {
+    const entryPath = resolve(piSubagentsRoot, relativePath);
+    if (!existsSync(entryPath)) continue;
+
+    const source = readFileSync(entryPath, "utf8");
+    const patched = patchPiSubagentsSource(relativePath, source);
+    if (patched !== source) {
+      writeFileSync(entryPath, patched, "utf8");
+    }
+  }
+}

if (packageJsonPath && existsSync(packageJsonPath)) {
  const pkg = JSON.parse(readFileSync(packageJsonPath, "utf8"));
  if (pkg.piConfig?.name !== "feynman" || pkg.piConfig?.configDir !== ".feynman") {
@@ -1,19 +0,0 @@
---
name: valichord-validation
description: Integrate with ValiChord to submit a replication as a cryptographically verified validator attestation, discover studies awaiting independent validation, query Harmony Records and reproducibility badges, or assist researchers in preparing a study for the validation pipeline. Feynman operates as a first-class AI validator — publishing a validator profile, claiming studies, running the blind commit-reveal protocol, and accumulating a verifiable per-discipline reputation. Also surfaces reproducibility status during /deepresearch and literature reviews via ValiChord's HTTP Gateway.
---

# ValiChord Validation

Run the `/valichord` workflow. Read the prompt template at `prompts/valichord.md` for the full procedure.

ValiChord is a four-DNA Holochain system for scientific reproducibility verification. Feynman integrates at four points:
- As a **validator agent** — running `/replicate` then submitting findings as a sealed attestation into the blind commit-reveal protocol, earning reproducibility badges for researchers and building Feynman's own verifiable per-discipline reputation (Provisional → Certified → Senior)
- As a **proactive discovery agent** — querying the pending study queue by discipline, assessing difficulty, and autonomously claiming appropriate validation work without waiting to be assigned
- As a **researcher's assistant** — helping prepare studies for submission: registering protocols, taking cryptographic data snapshots, and running the Repository Readiness Checker to identify and fix reproducibility failure modes before validation begins
- As a **research query tool** — checking whether a study carries a Harmony Record or reproducibility badge (Gold/Silver/Bronze) via ValiChord's HTTP Gateway, for use during `/deepresearch` or literature reviews

Output: a Harmony Record — an immutable, publicly accessible cryptographic proof of independent reproducibility written to the ValiChord Governance DHT — plus automatic badge issuance and an updated validator reputation score.

Live demo (commit-reveal cycle end-to-end): https://youtu.be/DQ5wZSD1YEw
ValiChord repo: https://github.com/topeuph-ai/ValiChord
tests/pi-subagents-patch.test.ts (new file, 104 lines)
@@ -0,0 +1,104 @@
import test from "node:test";
import assert from "node:assert/strict";

import { patchPiSubagentsSource } from "../scripts/lib/pi-subagents-patch.mjs";

const CASES = [
  {
    name: "index.ts config path",
    file: "index.ts",
    input: [
      'import * as os from "node:os";',
      'import * as path from "node:path";',
      'const configPath = path.join(os.homedir(), ".pi", "agent", "extensions", "subagent", "config.json");',
      "",
    ].join("\n"),
    original: 'const configPath = path.join(os.homedir(), ".pi", "agent", "extensions", "subagent", "config.json");',
    expected: 'const configPath = path.join(resolvePiAgentDir(), "extensions", "subagent", "config.json");',
  },
  {
    name: "agents.ts user agents dir",
    file: "agents.ts",
    input: [
      'import * as os from "node:os";',
      'import * as path from "node:path";',
      'const userDir = path.join(os.homedir(), ".pi", "agent", "agents");',
      "",
    ].join("\n"),
    original: 'const userDir = path.join(os.homedir(), ".pi", "agent", "agents");',
    expected: 'const userDir = path.join(resolvePiAgentDir(), "agents");',
  },
  {
    name: "artifacts.ts sessions dir",
    file: "artifacts.ts",
    input: [
      'import * as os from "node:os";',
      'import * as path from "node:path";',
      'const sessionsBase = path.join(os.homedir(), ".pi", "agent", "sessions");',
      "",
    ].join("\n"),
    original: 'const sessionsBase = path.join(os.homedir(), ".pi", "agent", "sessions");',
    expected: 'const sessionsBase = path.join(resolvePiAgentDir(), "sessions");',
  },
  {
    name: "run-history.ts history file",
    file: "run-history.ts",
    input: [
      'import * as os from "node:os";',
      'import * as path from "node:path";',
      'const HISTORY_PATH = path.join(os.homedir(), ".pi", "agent", "run-history.jsonl");',
      "",
    ].join("\n"),
    original: 'const HISTORY_PATH = path.join(os.homedir(), ".pi", "agent", "run-history.jsonl");',
    expected: 'const HISTORY_PATH = path.join(resolvePiAgentDir(), "run-history.jsonl");',
  },
  {
    name: "skills.ts agent dir",
    file: "skills.ts",
    input: [
      'import * as os from "node:os";',
      'import * as path from "node:path";',
      'const AGENT_DIR = path.join(os.homedir(), ".pi", "agent");',
      "",
    ].join("\n"),
    original: 'const AGENT_DIR = path.join(os.homedir(), ".pi", "agent");',
    expected: "const AGENT_DIR = resolvePiAgentDir();",
  },
  {
    name: "chain-clarify.ts chain save dir",
    file: "chain-clarify.ts",
    input: [
      'import * as os from "node:os";',
      'import * as path from "node:path";',
      'const dir = path.join(os.homedir(), ".pi", "agent", "agents");',
      "",
    ].join("\n"),
    original: 'const dir = path.join(os.homedir(), ".pi", "agent", "agents");',
    expected: 'const dir = path.join(resolvePiAgentDir(), "agents");',
  },
];

for (const scenario of CASES) {
  test(`patchPiSubagentsSource rewrites ${scenario.name}`, () => {
    const patched = patchPiSubagentsSource(scenario.file, scenario.input);

    assert.match(patched, /function resolvePiAgentDir\(\): string \{/);
    assert.match(patched, /process\.env\.PI_CODING_AGENT_DIR\?\.trim\(\)/);
    assert.ok(patched.includes(scenario.expected));
    assert.ok(!patched.includes(scenario.original));
  });
}

test("patchPiSubagentsSource is idempotent", () => {
  const input = [
    'import * as os from "node:os";',
    'import * as path from "node:path";',
    'const configPath = path.join(os.homedir(), ".pi", "agent", "extensions", "subagent", "config.json");',
    "",
  ].join("\n");

  const once = patchPiSubagentsSource("index.ts", input);
  const twice = patchPiSubagentsSource("index.ts", once);

  assert.equal(twice, once);
});
@@ -62,13 +62,13 @@ These installers download only the `skills/` tree from the Feynman repository. T
The one-line installer already targets the latest tagged release. To pin an exact version, pass it explicitly:

```bash
-curl -fsSL https://feynman.is/install | bash -s -- 0.2.15
+curl -fsSL https://feynman.is/install | bash -s -- 0.2.16
```

On Windows:

```powershell
-& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.15
+& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.16
```
## pnpm