Compare commits

3 Commits

| Author | SHA1 | Date |
|---|---|---|
|  | ca559dfd91 |  |
|  | 46b2aa93d0 |  |
|  | 043e241464 |  |
@@ -17,7 +17,7 @@ You receive a draft document and the research files it was built from. Your job
 4. **Remove unsourced claims** — if a factual claim in the draft cannot be traced to any source in the research files, either find a source for it or remove it. Do not leave unsourced factual claims.
 5. **Verify meaning, not just topic overlap.** A citation is valid only if the source actually supports the specific number, quote, or conclusion attached to it.
 6. **Refuse fake certainty.** Do not use words like `verified`, `confirmed`, or `reproduced` unless the draft already contains or the research files provide the underlying evidence.
-7. **Never invent or keep fabricated results.** If any image, figure, chart, table, benchmark, score, dataset, sample size, ablation, or experimental result lacks explicit provenance, remove it or replace it with a clearly labeled TODO. Never keep a made-up result because it “looks plausible.”
+7. **Enforce the system prompt's provenance rule.** Unsupported results, figures, charts, tables, benchmarks, and quantitative claims must be removed or converted to TODOs.
 
 ## Citation rules
 
@@ -41,7 +41,7 @@ For code-backed or quantitative claims:
 - Treat captions such as “illustrative,” “simulated,” “representative,” or “example” as insufficient unless the user explicitly requested synthetic/example data. Otherwise remove the visual and mark the missing experiment.
 - Do not preserve polished summaries that outrun the raw evidence.
 
-## Fabrication audit
+## Result provenance audit
 
 Before saving the final document, scan for:
 
 - numeric scores or percentages,
@@ -15,7 +15,7 @@ You are Feynman's writing subagent.
 3. **Be explicit about gaps.** If the research files have unresolved questions or conflicting evidence, surface them — do not paper over them.
 4. **Do not promote draft text into fact.** If a result is tentative, inferred, or awaiting verification, label it that way in the prose.
 5. **No aesthetic laundering.** Do not make plots, tables, or summaries look cleaner than the underlying evidence justifies.
-6. **Never fabricate results.** Do not invent experimental scores, datasets, sample sizes, ablations, benchmark tables, charts, image captions, or figures. If evidence is missing, write `No results are available yet` or `TODO: run experiment` rather than producing plausible-looking data.
+6. **Follow the system prompt's provenance rule.** Missing results become gaps or TODOs, never plausible-looking data.
 
 ## Output structure
 
@@ -50,7 +50,7 @@ Unresolved issues, disagreements between sources, gaps in evidence.
 - Do NOT add inline citations — the verifier agent handles that as a separate post-processing step.
 - Do NOT add a Sources section — the verifier agent builds that.
 - Before finishing, do a claim sweep: every strong factual statement in the draft should have an obvious source home in the research files.
-- Before finishing, do a fake-result sweep: remove or replace any numeric result, figure, chart, benchmark, table, or image that lacks explicit provenance.
+- Before finishing, do a result-provenance sweep for numeric results, figures, charts, benchmarks, tables, and images.
 
 ## Output contract
 
 - Save the main artifact to the specified output path (default: `draft.md`).
.github/workflows/publish.yml — 3 changes (vendored)
@@ -29,7 +29,8 @@ jobs:
         run: |
           LOCAL=$(node -p "require('./package.json').version")
           echo "version=$LOCAL" >> "$GITHUB_OUTPUT"
-          if gh release view "v$LOCAL" >/dev/null 2>&1; then
+          PUBLISHED=$(npm view @companion-ai/feynman version 2>/dev/null || true)
+          if [ "$PUBLISHED" = "$LOCAL" ] || gh release view "v$LOCAL" >/dev/null 2>&1; then
             echo "should_release=false" >> "$GITHUB_OUTPUT"
           else
             echo "should_release=true" >> "$GITHUB_OUTPUT"
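The reworked gate above skips a release when either the npm registry already has the local version or a matching GitHub tag exists. The decision itself can be sketched as a standalone helper (`shouldRelease` is a hypothetical name for illustration, not part of the workflow):

```javascript
// Hypothetical sketch of the release gate: publish only when neither npm nor
// an existing GitHub release already carries the local version.
function shouldRelease(localVersion, publishedVersion, releaseExists) {
  // `npm view` prints nothing for unpublished packages, so "" means absent.
  if (publishedVersion === localVersion) return false; // npm is already current
  if (releaseExists) return false; // tag v<version> already released
  return true;
}

console.log(shouldRelease("0.2.21", "0.2.21", false)); // false
console.log(shouldRelease("0.2.21", "0.2.20", true));  // false
console.log(shouldRelease("0.2.21", "0.2.20", false)); // true
```

The `|| true` after `npm view` in the workflow keeps the step from failing on a never-published package, which is why the sketch treats an empty published version as "not published yet".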
@@ -25,7 +25,7 @@ curl -fsSL https://feynman.is/install | bash
 irm https://feynman.is/install.ps1 | iex
 ```
 
-The one-line installer fetches the latest tagged release. To pin a version, pass it explicitly, for example `curl -fsSL https://feynman.is/install | bash -s -- 0.2.20`.
+The one-line installer fetches the latest tagged release. To pin a version, pass it explicitly, for example `curl -fsSL https://feynman.is/install | bash -s -- 0.2.21`.
 
 The installer downloads a standalone native bundle with its own Node.js runtime.
 
package-lock.json — 4 changes (generated)
@@ -1,12 +1,12 @@
 {
   "name": "@companion-ai/feynman",
-  "version": "0.2.20",
+  "version": "0.2.21",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "@companion-ai/feynman",
-      "version": "0.2.20",
+      "version": "0.2.21",
       "hasInstallScript": true,
       "license": "MIT",
       "dependencies": {
@@ -1,6 +1,6 @@
 {
   "name": "@companion-ai/feynman",
-  "version": "0.2.20",
+  "version": "0.2.21",
   "description": "Research-first CLI agent built on Pi and alphaXiv",
   "license": "MIT",
   "type": "module",
@@ -110,7 +110,7 @@ This usually means the release exists, but not all platform bundles were uploaded
 Workarounds:
 - try again after the release finishes publishing
 - pass the latest published version explicitly, e.g.:
-    & ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.20
+    & ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.21
 "@
 }
 
@@ -261,7 +261,7 @@ This usually means the release exists, but not all platform bundles were uploaded
 Workarounds:
 - try again after the release finishes publishing
 - pass the latest published version explicitly, e.g.:
-    curl -fsSL https://feynman.is/install | bash -s -- 0.2.20
+    curl -fsSL https://feynman.is/install | bash -s -- 0.2.21
 EOF
   exit 1
 fi
@@ -260,6 +260,23 @@ function ensureParentDir(path) {
   mkdirSync(dirname(path), { recursive: true });
 }
 
+function packageDependencyExists(packagePath, globalNodeModulesRoot, dependency) {
+  return existsSync(resolve(packagePath, "node_modules", dependency)) ||
+    existsSync(resolve(globalNodeModulesRoot, dependency));
+}
+
+function installedPackageLooksUsable(packagePath, globalNodeModulesRoot) {
+  if (!existsSync(resolve(packagePath, "package.json"))) return false;
+  try {
+    const pkg = JSON.parse(readFileSync(resolve(packagePath, "package.json"), "utf8"));
+    return Object.keys(pkg.dependencies ?? {}).every((dependency) =>
+      packageDependencyExists(packagePath, globalNodeModulesRoot, dependency)
+    );
+  } catch {
+    return false;
+  }
+}
+
 function linkPointsTo(linkPath, targetPath) {
   try {
     if (!lstatSync(linkPath).isSymbolicLink()) return false;
@@ -281,6 +298,8 @@ function ensureBundledPackageLinks(packageSpecs) {
     try {
       if (lstatSync(targetPath).isSymbolicLink()) {
         rmSync(targetPath, { force: true });
+      } else if (!installedPackageLooksUsable(targetPath, globalNodeModulesRoot)) {
+        rmSync(targetPath, { recursive: true, force: true });
       }
     } catch {}
     if (existsSync(targetPath)) continue;
@@ -1,11 +1,41 @@
 import { dirname, resolve } from "node:path";
 
 import { AuthStorage, ModelRegistry } from "@mariozechner/pi-coding-agent";
+import { getModels } from "@mariozechner/pi-ai";
+import { anthropicOAuthProvider } from "@mariozechner/pi-ai/oauth";
 
 export function getModelsJsonPath(authPath: string): string {
   return resolve(dirname(authPath), "models.json");
 }
 
-export function createModelRegistry(authPath: string): ModelRegistry {
-  return ModelRegistry.create(AuthStorage.create(authPath), getModelsJsonPath(authPath));
+function registerFeynmanModelOverlays(modelRegistry: ModelRegistry): void {
+  const anthropicModels = getModels("anthropic");
+  if (anthropicModels.some((model) => model.id === "claude-opus-4-7")) {
+    return;
+  }
+
+  const opus46 = anthropicModels.find((model) => model.id === "claude-opus-4-6");
+  if (!opus46) {
+    return;
+  }
+
+  modelRegistry.registerProvider("anthropic", {
+    baseUrl: "https://api.anthropic.com",
+    api: "anthropic-messages",
+    oauth: anthropicOAuthProvider,
+    models: [
+      ...anthropicModels,
+      {
+        ...opus46,
+        id: "claude-opus-4-7",
+        name: "Claude Opus 4.7",
+      },
+    ],
+  });
+}
+
+export function createModelRegistry(authPath: string): ModelRegistry {
+  const registry = ModelRegistry.create(AuthStorage.create(authPath), getModelsJsonPath(authPath));
+  registerFeynmanModelOverlays(registry);
+  return registry;
 }
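`registerFeynmanModelOverlays` follows a simple pattern: if the bundled catalog already knows the new model id, do nothing; otherwise clone the closest existing entry under the new id. Stripped of the pi-ai specifics, the pattern reduces to a pure list transformation (the model objects and the `overlayModel` helper here are made-up stand-ins, not the real registry types):

```javascript
// Simplified stand-in for the overlay pattern: reuse an existing catalog entry
// as the template for a newer model id until the upstream catalog catches up.
function overlayModel(models, baseId, newId, newName) {
  if (models.some((model) => model.id === newId)) return models; // upstream already ships it
  const base = models.find((model) => model.id === baseId);
  if (!base) return models; // nothing to clone from
  return [...models, { ...base, id: newId, name: newName }];
}

const catalog = [{ id: "claude-opus-4-6", name: "Claude Opus 4.6" }];
const overlaid = overlayModel(catalog, "claude-opus-4-6", "claude-opus-4-7", "Claude Opus 4.7");
console.log(overlaid.map((model) => model.id)); // [ 'claude-opus-4-6', 'claude-opus-4-7' ]
```

Because the overlay is a no-op once the base catalog ships the new id, the registration stays safe to leave in place after Pi updates upstream.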
@@ -1,5 +1,5 @@
 import { spawn } from "node:child_process";
-import { cpSync, existsSync, lstatSync, mkdirSync, readlinkSync, rmSync, symlinkSync, writeFileSync } from "node:fs";
+import { cpSync, existsSync, lstatSync, mkdirSync, readFileSync, readlinkSync, rmSync, symlinkSync, writeFileSync } from "node:fs";
 import { fileURLToPath } from "node:url";
 import { dirname, join, resolve } from "node:path";
 
@@ -423,6 +423,47 @@ function linkDirectory(linkPath: string, targetPath: string): void {
   }
 }
 
+function packageNameToPath(root: string, packageName: string): string {
+  return resolve(root, packageName);
+}
+
+function packageDependencyExists(packagePath: string, globalNodeModulesRoot: string, dependency: string): boolean {
+  return existsSync(packageNameToPath(resolve(packagePath, "node_modules"), dependency)) ||
+    existsSync(packageNameToPath(globalNodeModulesRoot, dependency));
+}
+
+function installedPackageLooksUsable(packagePath: string, globalNodeModulesRoot: string): boolean {
+  if (!existsSync(resolve(packagePath, "package.json"))) {
+    return false;
+  }
+
+  try {
+    const pkg = JSON.parse(readFileSync(resolve(packagePath, "package.json"), "utf8")) as {
+      dependencies?: Record<string, string>;
+    };
+    const dependencies = Object.keys(pkg.dependencies ?? {});
+    return dependencies.every((dependency) => packageDependencyExists(packagePath, globalNodeModulesRoot, dependency));
+  } catch {
+    return false;
+  }
+}
+
+function replaceBrokenPackageWithBundledCopy(targetPath: string, bundledPackagePath: string, globalNodeModulesRoot: string): boolean {
+  if (!existsSync(targetPath)) {
+    return false;
+  }
+  if (pathsMatchSymlinkTarget(targetPath, bundledPackagePath)) {
+    return false;
+  }
+  if (installedPackageLooksUsable(targetPath, globalNodeModulesRoot)) {
+    return false;
+  }
+
+  rmSync(targetPath, { recursive: true, force: true });
+  linkDirectory(targetPath, bundledPackagePath);
+  return true;
+}
+
 export function seedBundledWorkspacePackages(
   agentDir: string,
   appRoot: string,
@@ -446,6 +487,10 @@ export function seedBundledWorkspacePackages(
     if (!existsSync(bundledPackagePath)) continue;
 
     const targetPath = resolve(globalNodeModulesRoot, parsed.name);
+    if (replaceBrokenPackageWithBundledCopy(targetPath, bundledPackagePath, globalNodeModulesRoot)) {
+      seeded.push(source);
+      continue;
+    }
     if (!existsSync(targetPath)) {
       linkDirectory(targetPath, bundledPackagePath);
       seeded.push(source);
@@ -39,14 +39,20 @@ test("research writing prompts forbid fabricated results and unproven figures",
 
   for (const [label, content] of [
     ["system prompt", systemPrompt],
-    ["writer prompt", writerPrompt],
-    ["verifier prompt", verifierPrompt],
   ] as const) {
     assert.match(content, /Never (invent|fabricate)/i, `${label} must explicitly forbid invented or fabricated results`);
     assert.match(content, /(figure|chart|image|table)/i, `${label} must cover visual/table provenance`);
     assert.match(content, /(provenance|source|artifact|script|raw)/i, `${label} must require traceable support`);
   }
 
+  for (const [label, content] of [
+    ["writer prompt", writerPrompt],
+    ["verifier prompt", verifierPrompt],
+    ["draft prompt", draftPrompt],
+  ] as const) {
+    assert.match(content, /system prompt.*provenance rule/i, `${label} must point back to the system provenance rule`);
+  }
+
   assert.match(draftPrompt, /system prompt's provenance rules/i);
   assert.match(draftPrompt, /placeholder or proposed experimental plan/i);
   assert.match(draftPrompt, /source-backed quantitative data/i);
@@ -7,6 +7,7 @@ import { join } from "node:path";
 import { resolveInitialPrompt, shouldRunInteractiveSetup } from "../src/cli.js";
 import { buildModelStatusSnapshotFromRecords, chooseRecommendedModel } from "../src/model/catalog.js";
 import { resolveModelProviderForCommand, setDefaultModelSpec } from "../src/model/commands.js";
+import { createModelRegistry } from "../src/model/registry.js";
 
 function createAuthPath(contents: Record<string, unknown>): string {
   const root = mkdtempSync(join(tmpdir(), "feynman-auth-"));
@@ -26,6 +27,17 @@ test("chooseRecommendedModel prefers the strongest authenticated research model"
   assert.equal(recommendation?.spec, "anthropic/claude-opus-4-6");
 });
 
+test("createModelRegistry overlays new Anthropic Opus model before upstream Pi updates", () => {
+  const authPath = createAuthPath({
+    anthropic: { type: "api_key", key: "anthropic-test-key" },
+  });
+
+  const registry = createModelRegistry(authPath);
+
+  assert.ok(registry.find("anthropic", "claude-opus-4-7"));
+  assert.equal(registry.getAvailable().some((model) => model.provider === "anthropic" && model.id === "claude-opus-4-7"), true);
+});
+
 test("setDefaultModelSpec accepts a unique bare model id from authenticated models", () => {
   const authPath = createAuthPath({
     openai: { type: "api_key", key: "openai-test-key" },
@@ -6,13 +6,17 @@ import { join, resolve } from "node:path";
 
 import { installPackageSources, seedBundledWorkspacePackages, updateConfiguredPackages } from "../src/pi/package-ops.js";
 
-function createBundledWorkspace(appRoot: string, packageNames: string[]): void {
+function createBundledWorkspace(
+  appRoot: string,
+  packageNames: string[],
+  dependenciesByPackage: Record<string, Record<string, string>> = {},
+): void {
   for (const packageName of packageNames) {
     const packageDir = resolve(appRoot, ".feynman", "npm", "node_modules", packageName);
     mkdirSync(packageDir, { recursive: true });
     writeFileSync(
       join(packageDir, "package.json"),
-      JSON.stringify({ name: packageName, version: "1.0.0" }, null, 2) + "\n",
+      JSON.stringify({ name: packageName, version: "1.0.0", dependencies: dependenciesByPackage[packageName] }, null, 2) + "\n",
       "utf8",
     );
   }
@@ -76,6 +80,33 @@ test("seedBundledWorkspacePackages preserves existing installed packages", () =>
   assert.equal(lstatSync(existingPackageDir).isSymbolicLink(), false);
 });
 
+test("seedBundledWorkspacePackages repairs broken existing bundled packages", () => {
+  const appRoot = mkdtempSync(join(tmpdir(), "feynman-bundle-"));
+  const homeRoot = mkdtempSync(join(tmpdir(), "feynman-home-"));
+  const agentDir = resolve(homeRoot, "agent");
+  const existingPackageDir = resolve(homeRoot, "npm-global", "lib", "node_modules", "pi-markdown-preview");
+
+  mkdirSync(agentDir, { recursive: true });
+  createBundledWorkspace(appRoot, ["pi-markdown-preview", "puppeteer-core"], {
+    "pi-markdown-preview": { "puppeteer-core": "^24.0.0" },
+  });
+  mkdirSync(existingPackageDir, { recursive: true });
+  writeFileSync(
+    resolve(existingPackageDir, "package.json"),
+    JSON.stringify({ name: "pi-markdown-preview", version: "broken", dependencies: { "puppeteer-core": "^24.0.0" } }) + "\n",
+    "utf8",
+  );
+
+  const seeded = seedBundledWorkspacePackages(agentDir, appRoot, ["npm:pi-markdown-preview"]);
+
+  assert.deepEqual(seeded, ["npm:pi-markdown-preview"]);
+  assert.equal(lstatSync(existingPackageDir).isSymbolicLink(), true);
+  assert.equal(
+    readFileSync(resolve(existingPackageDir, "package.json"), "utf8").includes('"version": "1.0.0"'),
+    true,
+  );
+});
+
 test("installPackageSources filters noisy npm chatter but preserves meaningful output", async () => {
   const root = mkdtempSync(join(tmpdir(), "feynman-package-ops-"));
   const workingDir = resolve(root, "project");
@@ -117,13 +117,13 @@ These installers download the bundled `skills/` and `prompts/` trees plus the re
 The one-line installer already targets the latest tagged release. To pin an exact version, pass it explicitly:
 
 ```bash
-curl -fsSL https://feynman.is/install | bash -s -- 0.2.20
+curl -fsSL https://feynman.is/install | bash -s -- 0.2.21
 ```
 
 On Windows:
 
 ```powershell
-& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.20
+& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.21
 ```
 
 ## Post-install setup
@@ -35,7 +35,7 @@ When working from existing session context (after a deep research or literature
 
 The writer pays attention to academic conventions: claims are attributed to their sources with inline citations, methodology sections describe procedures precisely, and limitations are discussed honestly. The draft includes placeholder sections for any content the writer cannot generate from available sources, clearly marking what needs human input.
 
-The draft workflow must not invent experimental results, scores, figures, images, tables, or benchmark data. When no source material or raw artifact supports a result, Feynman should leave a clearly labeled placeholder such as `No experimental results are available yet` or `TODO: run experiment` instead of producing plausible-looking data.
+Drafts follow Feynman's system-wide provenance rules: unsupported results, figures, images, tables, or benchmark data should become clearly labeled gaps or TODOs, not plausible-looking claims.
 
 ## Output format