Compare commits

2 Commits

| Author | SHA1 | Date |
|---|---|---|
| | d30506c82a | |
| | c3f7f6ec08 | |
@@ -15,6 +15,8 @@ Operating rules:
 - Never answer a latest/current question from arXiv or alpha-backed paper search alone.
 - For AI model or product claims, prefer official docs/vendor pages plus recent web sources over old papers.
 - Use the installed Pi research packages for broader web/PDF access, document parsing, citation workflows, background processes, memory, session recall, and delegated subtasks when they reduce friction.
+- You are running inside the Feynman/Pi runtime with filesystem tools, package tools, and configured extensions. Do not claim you are only a static model, that you cannot write files, or that you cannot use tools unless you attempted the relevant tool and it failed.
+- If a tool, package, source, or network route is unavailable, record the specific failed capability and still write the requested durable artifact with a clear `Blocked / Unverified` status instead of stopping with chat-only prose.
 - Feynman ships project subagents for research work. Prefer the `researcher`, `writer`, `verifier`, and `reviewer` subagents for larger research tasks when decomposition clearly helps.
 - Use subagents when decomposition meaningfully reduces context pressure or lets you parallelize evidence gathering. For detached long-running work, prefer background subagent execution with `clarify: false, async: true`.
 - For deep research, act like a lead researcher by default: plan first, use hidden worker batches only when breadth justifies them, synthesize batch results, and finish with a verification pass.
@@ -44,6 +46,7 @@ Operating rules:
 - When citing papers from alpha-backed tools, prefer direct arXiv or alphaXiv links and include the arXiv ID.
 - Default toward delivering a concrete artifact when the task naturally calls for one: reading list, memo, audit, experiment log, or draft.
 - For user-facing workflows, produce exactly one canonical durable Markdown artifact unless the user explicitly asks for multiple deliverables.
+- If a workflow requests a durable artifact, verify the file exists on disk before the final response. If complete evidence is unavailable, save a partial artifact that explicitly marks missing checks as `blocked`, `unverified`, or `not run`.
 - Do not create extra user-facing intermediate markdown files just because the workflow has multiple reasoning stages.
 - Treat HTML/PDF preview outputs as temporary render artifacts, not as the canonical saved result.
 - Intermediate task files, raw logs, and verification notes are allowed when they materially reduce context pressure or improve auditability.
@@ -25,7 +25,7 @@ curl -fsSL https://feynman.is/install | bash
 irm https://feynman.is/install.ps1 | iex
 ```
 
-The one-line installer fetches the latest tagged release. To pin a version, pass it explicitly, for example `curl -fsSL https://feynman.is/install | bash -s -- 0.2.22`.
+The one-line installer fetches the latest tagged release. To pin a version, pass it explicitly, for example `curl -fsSL https://feynman.is/install | bash -s -- 0.2.24`.
 
 The installer downloads a standalone native bundle with its own Node.js runtime.
 
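As background for the pinning example above (a generic shell sketch, not part of the installer itself): `bash -s` makes bash read the script from stdin, and everything after `--` is passed through as the script's positional parameters, which is how a version argument like `0.2.24` reaches the piped install script.

```shell
# Pipe a tiny script into `bash -s`; arguments after `--` become $1, $2, ...
echo 'echo "requested version: ${1:-latest}"' | bash -s -- 0.2.24
# prints: requested version: 0.2.24
```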
package-lock.json (generated) — 4 additions, 4 deletions
@@ -1,12 +1,12 @@
 {
   "name": "@companion-ai/feynman",
-  "version": "0.2.22",
+  "version": "0.2.24",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "@companion-ai/feynman",
-      "version": "0.2.22",
+      "version": "0.2.24",
       "hasInstallScript": true,
       "license": "MIT",
       "dependencies": {
@@ -1,6 +1,6 @@
 {
   "name": "@companion-ai/feynman",
-  "version": "0.2.22",
+  "version": "0.2.24",
   "description": "Research-first CLI agent built on Pi and alphaXiv",
   "license": "MIT",
   "type": "module",
@@ -53,6 +53,8 @@ Also save the plan with `memory_remember` (type: `fact`, key: `deepresearch.<slu
 
 Present the plan to the user. If this is an unattended or one-shot run, continue automatically. If the user is actively interacting in the terminal, give them a brief chance to request plan changes before proceeding.
 
+Do not stop after planning. If live search, subagents, web access, alphaXiv, or any other capability is unavailable, continue in degraded mode and write a durable blocked/partial report that records exactly which capabilities failed.
+
 ## 2. Scale decision
 
 | Query type | Execution |
@@ -105,6 +107,13 @@ When the work spans multiple rounds, also append a concise chronological entry t
 
 Most topics need 1-2 rounds. Stop when additional rounds would not materially change conclusions.
 
+If no researcher files can be produced because tools, subagents, or network access failed, create `outputs/.drafts/<slug>-draft.md` yourself as a blocked report with:
+
+- what was requested,
+- which capabilities failed,
+- what evidence was and was not gathered,
+- a proposed source-gathering plan,
+- no invented sources or results.
 
 ## 5. Write the report
 
 Once evidence is sufficient, YOU write the full research brief directly. Do not delegate writing to another agent. Read the research files, synthesize the findings, and produce a complete document:
@@ -190,6 +199,7 @@ Before you stop, verify on disk that all of these exist:
 - `outputs/<slug>.provenance.md` or `papers/<slug>.provenance.md` provenance sidecar
 
 Do not stop at `<slug>-brief.md` alone. If the cited brief exists but the promoted final output or provenance sidecar does not, create them before responding.
+If full verification could not be completed, still create the final deliverable and provenance sidecar with `Verification: BLOCKED` or `PASS WITH NOTES` and list the missing checks. Never end with only an explanation in chat.
 
 ## Background execution
 
@@ -110,7 +110,7 @@ This usually means the release exists, but not all platform bundles were uploade
 Workarounds:
 - try again after the release finishes publishing
 - pass the latest published version explicitly, e.g.:
-  & ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.22
+  & ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.24
 "@
 }
 
@@ -261,7 +261,7 @@ This usually means the release exists, but not all platform bundles were uploade
 Workarounds:
 - try again after the release finishes publishing
 - pass the latest published version explicitly, e.g.:
-  curl -fsSL https://feynman.is/install | bash -s -- 0.2.22
+  curl -fsSL https://feynman.is/install | bash -s -- 0.2.24
 EOF
 exit 1
 fi
@@ -1,5 +1,5 @@
 import { spawnSync } from "node:child_process";
-import { existsSync, lstatSync, mkdirSync, readFileSync, readlinkSync, rmSync, symlinkSync, writeFileSync } from "node:fs";
+import { existsSync, lstatSync, mkdirSync, readdirSync, readFileSync, readlinkSync, rmSync, symlinkSync, writeFileSync } from "node:fs";
 import { createRequire } from "node:module";
 import { homedir } from "node:os";
 import { delimiter, dirname, resolve } from "node:path";
@@ -286,28 +286,53 @@ function linkPointsTo(linkPath, targetPath) {
   }
 }
 
+function listWorkspacePackageNames(root) {
+  if (!existsSync(root)) return [];
+  const names = [];
+  for (const entry of readdirSync(root, { withFileTypes: true })) {
+    if (!entry.isDirectory() && !entry.isSymbolicLink()) continue;
+    if (entry.name.startsWith(".")) continue;
+    if (entry.name.startsWith("@")) {
+      const scopeRoot = resolve(root, entry.name);
+      for (const scopedEntry of readdirSync(scopeRoot, { withFileTypes: true })) {
+        if (!scopedEntry.isDirectory() && !scopedEntry.isSymbolicLink()) continue;
+        names.push(`${entry.name}/${scopedEntry.name}`);
+      }
+      continue;
+    }
+    names.push(entry.name);
+  }
+  return names;
+}
+
+function linkBundledPackage(packageName) {
+  const sourcePath = resolve(workspaceRoot, packageName);
+  const targetPath = resolve(globalNodeModulesRoot, packageName);
+  if (!existsSync(sourcePath)) return false;
+  if (linkPointsTo(targetPath, sourcePath)) return false;
+  try {
+    if (lstatSync(targetPath).isSymbolicLink()) {
+      rmSync(targetPath, { force: true });
+    } else if (!installedPackageLooksUsable(targetPath, globalNodeModulesRoot)) {
+      rmSync(targetPath, { recursive: true, force: true });
+    }
+  } catch {}
+  if (existsSync(targetPath)) return false;
+
+  ensureParentDir(targetPath);
+  try {
+    symlinkSync(sourcePath, targetPath, process.platform === "win32" ? "junction" : "dir");
+    return true;
+  } catch {
+    return false;
+  }
+}
+
 function ensureBundledPackageLinks(packageSpecs) {
   if (!workspaceMatchesRuntime(packageSpecs)) return;
 
-  for (const spec of packageSpecs) {
-    const packageName = parsePackageName(spec);
-    const sourcePath = resolve(workspaceRoot, packageName);
-    const targetPath = resolve(globalNodeModulesRoot, packageName);
-    if (!existsSync(sourcePath)) continue;
-    if (linkPointsTo(targetPath, sourcePath)) continue;
-    try {
-      if (lstatSync(targetPath).isSymbolicLink()) {
-        rmSync(targetPath, { force: true });
-      } else if (!installedPackageLooksUsable(targetPath, globalNodeModulesRoot)) {
-        rmSync(targetPath, { recursive: true, force: true });
-      }
-    } catch {}
-    if (existsSync(targetPath)) continue;
-
-    ensureParentDir(targetPath);
-    try {
-      symlinkSync(sourcePath, targetPath, process.platform === "win32" ? "junction" : "dir");
-    } catch {}
+  for (const packageName of listWorkspacePackageNames(workspaceRoot)) {
+    linkBundledPackage(packageName);
   }
 }
 
@@ -1,5 +1,5 @@
 import { spawn } from "node:child_process";
-import { cpSync, existsSync, lstatSync, mkdirSync, readFileSync, readlinkSync, rmSync, symlinkSync, writeFileSync } from "node:fs";
+import { cpSync, existsSync, lstatSync, mkdirSync, readdirSync, readFileSync, readlinkSync, rmSync, symlinkSync, writeFileSync } from "node:fs";
 import { fileURLToPath } from "node:url";
 import { dirname, join, resolve } from "node:path";
 
@@ -427,6 +427,28 @@ function packageNameToPath(root: string, packageName: string): string {
   return resolve(root, packageName);
 }
 
+function listBundledWorkspacePackageNames(root: string): string[] {
+  if (!existsSync(root)) {
+    return [];
+  }
+
+  const names: string[] = [];
+  for (const entry of readdirSync(root, { withFileTypes: true })) {
+    if (!entry.isDirectory() && !entry.isSymbolicLink()) continue;
+    if (entry.name.startsWith(".")) continue;
+    if (entry.name.startsWith("@")) {
+      const scopeRoot = resolve(root, entry.name);
+      for (const scopedEntry of readdirSync(scopeRoot, { withFileTypes: true })) {
+        if (!scopedEntry.isDirectory() && !scopedEntry.isSymbolicLink()) continue;
+        names.push(`${entry.name}/${scopedEntry.name}`);
+      }
+      continue;
+    }
+    names.push(entry.name);
+  }
+  return names;
+}
+
 function packageDependencyExists(packagePath: string, globalNodeModulesRoot: string, dependency: string): boolean {
   return existsSync(packageNameToPath(resolve(packagePath, "node_modules"), dependency)) ||
     existsSync(packageNameToPath(globalNodeModulesRoot, dependency));
@@ -464,6 +486,23 @@ function replaceBrokenPackageWithBundledCopy(targetPath: string, bundledPackageP
   return true;
 }
 
+function seedBundledPackage(globalNodeModulesRoot: string, bundledNodeModulesRoot: string, packageName: string): boolean {
+  const bundledPackagePath = resolve(bundledNodeModulesRoot, packageName);
+  if (!existsSync(bundledPackagePath)) {
+    return false;
+  }
+
+  const targetPath = resolve(globalNodeModulesRoot, packageName);
+  if (replaceBrokenPackageWithBundledCopy(targetPath, bundledPackagePath, globalNodeModulesRoot)) {
+    return true;
+  }
+  if (!existsSync(targetPath)) {
+    linkDirectory(targetPath, bundledPackagePath);
+    return true;
+  }
+  return false;
+}
+
 export function seedBundledWorkspacePackages(
   agentDir: string,
   appRoot: string,
@@ -476,6 +515,10 @@ export function seedBundledWorkspacePackages(
 
   const globalNodeModulesRoot = resolve(getFeynmanNpmPrefixPath(agentDir), "lib", "node_modules");
   const seeded: string[] = [];
+  const bundledPackageNames = listBundledWorkspacePackageNames(bundledNodeModulesRoot);
+  for (const packageName of bundledPackageNames) {
+    seedBundledPackage(globalNodeModulesRoot, bundledNodeModulesRoot, packageName);
+  }
 
   for (const source of sources) {
     if (shouldSkipNativeSource(source)) continue;
@@ -483,16 +526,8 @@
     const parsed = parseNpmSource(source);
     if (!parsed) continue;
-
-    const bundledPackagePath = resolve(bundledNodeModulesRoot, parsed.name);
-    if (!existsSync(bundledPackagePath)) continue;
-
     const targetPath = resolve(globalNodeModulesRoot, parsed.name);
-    if (replaceBrokenPackageWithBundledCopy(targetPath, bundledPackagePath, globalNodeModulesRoot)) {
-      seeded.push(source);
-      continue;
-    }
-    if (!existsSync(targetPath)) {
-      linkDirectory(targetPath, bundledPackagePath);
-      seeded.push(source);
-    }
+    if (pathsMatchSymlinkTarget(targetPath, resolve(bundledNodeModulesRoot, parsed.name))) {
+      seeded.push(source);
+    }
   }
 }
@@ -57,3 +57,15 @@ test("research writing prompts forbid fabricated results and unproven figures",
   assert.match(draftPrompt, /placeholder or proposed experimental plan/i);
   assert.match(draftPrompt, /source-backed quantitative data/i);
 });
+
+test("deepresearch workflow requires durable artifacts even when blocked", () => {
+  const systemPrompt = readFileSync(join(repoRoot, ".feynman", "SYSTEM.md"), "utf8");
+  const deepResearchPrompt = readFileSync(join(repoRoot, "prompts", "deepresearch.md"), "utf8");
+
+  assert.match(systemPrompt, /Do not claim you are only a static model/i);
+  assert.match(systemPrompt, /write the requested durable artifact/i);
+  assert.match(deepResearchPrompt, /Do not stop after planning/i);
+  assert.match(deepResearchPrompt, /degraded mode/i);
+  assert.match(deepResearchPrompt, /Verification: BLOCKED/i);
+  assert.match(deepResearchPrompt, /Never end with only an explanation in chat/i);
+});
@@ -101,6 +101,7 @@ test("seedBundledWorkspacePackages repairs broken existing bundled packages", ()
 
   assert.deepEqual(seeded, ["npm:pi-markdown-preview"]);
   assert.equal(lstatSync(existingPackageDir).isSymbolicLink(), true);
+  assert.equal(lstatSync(resolve(homeRoot, "npm-global", "lib", "node_modules", "puppeteer-core")).isSymbolicLink(), true);
   assert.equal(
     readFileSync(resolve(existingPackageDir, "package.json"), "utf8").includes('"version": "1.0.0"'),
     true,
@@ -117,13 +117,13 @@ These installers download the bundled `skills/` and `prompts/` trees plus the re
 The one-line installer already targets the latest tagged release. To pin an exact version, pass it explicitly:
 
 ```bash
-curl -fsSL https://feynman.is/install | bash -s -- 0.2.22
+curl -fsSL https://feynman.is/install | bash -s -- 0.2.24
 ```
 
 On Windows:
 
 ```powershell
-& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.22
+& ([scriptblock]::Create((irm https://feynman.is/install.ps1))) -Version 0.2.24
 ```
 
 ## Post-install setup