diff --git a/.planning/ROADMAP.md b/.planning/ROADMAP.md index 2e9c17a..1358639 100644 --- a/.planning/ROADMAP.md +++ b/.planning/ROADMAP.md @@ -139,7 +139,15 @@ Plans: 3. `keyhunter keys list` shows all stored keys masked; `keyhunter keys show ` shows full unmasked detail 4. `keyhunter keys export --format=json` produces a JSON file with full key values; `--format=csv` produces a CSV 5. `keyhunter keys copy ` copies the full key to clipboard; `keyhunter keys delete ` removes the key from the database -**Plans**: TBD +**Plans**: 6 plans + +Plans: +- [ ] 06-01-PLAN.md — Wave 0: Formatter interface, colors.go (TTY/NO_COLOR), refactor TableFormatter +- [ ] 06-02-PLAN.md — JSONFormatter + CSVFormatter (full Finding fields, Unmask option) +- [ ] 06-03-PLAN.md — SARIF 2.1.0 formatter with custom structs (rule dedup, level mapping) +- [ ] 06-04-PLAN.md — pkg/storage/queries.go: Filters, ListFindingsFiltered, GetFinding, DeleteFinding +- [ ] 06-05-PLAN.md — cmd/keys.go command tree: list/show/export/copy/delete/verify (KEYS-01..06) +- [ ] 06-06-PLAN.md — scan --output registry dispatch + exit codes 0/1/2 (OUT-05, OUT-06) ### Phase 7: Import Adapters & CI/CD Integration **Goal**: Users can import findings from TruffleHog and Gitleaks into KeyHunter's database, and use KeyHunter in pre-commit hooks and CI/CD pipelines with SARIF output uploadable to GitHub Security diff --git a/.planning/phases/06-output-reporting/06-01-PLAN.md b/.planning/phases/06-output-reporting/06-01-PLAN.md new file mode 100644 index 0000000..db4c095 --- /dev/null +++ b/.planning/phases/06-output-reporting/06-01-PLAN.md @@ -0,0 +1,441 @@ +--- +phase: 06-output-reporting +plan: 01 +type: execute +wave: 0 +depends_on: [] +files_modified: + - pkg/output/formatter.go + - pkg/output/colors.go + - pkg/output/table.go + - pkg/output/table_test.go + - pkg/output/colors_test.go + - pkg/output/formatter_test.go + - go.mod +autonomous: true +requirements: [OUT-01, OUT-06] + +must_haves: + truths: + - "pkg/output 
exposes a Formatter interface all formats implement" + - "TableFormatter renders findings with colors only on TTY" + - "Non-TTY stdout produces no ANSI escape sequences" + - "An output.Registry maps format names to Formatter implementations" + artifacts: + - path: pkg/output/formatter.go + provides: "Formatter interface, Registry, Options struct" + exports: ["Formatter", "Options", "Register", "Get", "ErrUnknownFormat"] + - path: pkg/output/colors.go + provides: "TTY detection + profile selection" + exports: ["IsTTY", "ColorsEnabled"] + - path: pkg/output/table.go + provides: "Refactored TableFormatter implementing Formatter" + contains: "type TableFormatter struct" + key_links: + - from: pkg/output/table.go + to: pkg/output/formatter.go + via: "TableFormatter implements Formatter.Format(findings, w, opts)" + pattern: "func \\(.*TableFormatter\\) Format" + - from: pkg/output/table.go + to: pkg/output/colors.go + via: "Strips lipgloss colors when ColorsEnabled(w) is false" + pattern: "ColorsEnabled" +--- + + +Establish the Formatter interface and refactor the existing table output to implement it. Add TTY detection so colored output is only emitted to real terminals, and introduce a Registry so scan.go (Plan 06) can select formatters by name. This is the foundation all other formatter plans build on. + +Purpose: Unify output paths under one interface so JSON/SARIF/CSV formatters (Plans 02-03) can be added in parallel. +Output: `pkg/output/formatter.go`, `pkg/output/colors.go`, refactored `pkg/output/table.go`, unit tests. 
+ + + +@$HOME/.claude/get-shit-done/workflows/execute-plan.md +@$HOME/.claude/get-shit-done/templates/summary.md + + + +@.planning/PROJECT.md +@.planning/ROADMAP.md +@.planning/STATE.md +@.planning/phases/06-output-reporting/06-CONTEXT.md +@pkg/output/table.go +@pkg/engine/finding.go + + +From pkg/engine/finding.go: +```go +type Finding struct { + ProviderName string + KeyValue string + KeyMasked string + Confidence string // "high"|"medium"|"low" + Source string + SourceType string + LineNumber int + Offset int64 + DetectedAt time.Time + Verified bool + VerifyStatus string + VerifyHTTPCode int + VerifyMetadata map[string]string + VerifyError string +} +func MaskKey(key string) string +``` + +isatty is ALREADY available as an indirect dep in go.mod (github.com/mattn/go-isatty v0.0.20 via lipgloss). This plan promotes it to a direct dep via `go get`. + + + + + + + Task 1: Create Formatter interface, Options, Registry, and colors helper + pkg/output/formatter.go, pkg/output/colors.go, pkg/output/formatter_test.go, pkg/output/colors_test.go, go.mod + + - pkg/output/table.go (current API, styles) + - pkg/engine/finding.go (Finding struct) + - go.mod (isatty already indirect) + + + - Formatter interface: Format(findings []engine.Finding, w io.Writer, opts Options) error + - Options: { Unmask bool, ToolName string, ToolVersion string } + - Registry: Register(name string, f Formatter), Get(name string) (Formatter, error). ErrUnknownFormat sentinel. + - ColorsEnabled(w io.Writer) bool: returns true only if w is *os.File pointing at a TTY AND NO_COLOR env var is unset. + - IsTTY(f *os.File) bool: wraps isatty.IsTerminal(f.Fd()). + - Test: Register + Get round-trip; Get("nope") returns ErrUnknownFormat. + - Test: ColorsEnabled on bytes.Buffer returns false. + - Test: ColorsEnabled with NO_COLOR=1 returns false even if TTY would be true (use t.Setenv). + + + 1. 
Promote isatty to a direct dependency: `go get github.com/mattn/go-isatty@v0.0.20` (already resolved, this just moves it from indirect to direct in go.mod). + + 2. Create pkg/output/formatter.go: + ```go + package output + + import ( + "errors" + "fmt" + "io" + + "github.com/salvacybersec/keyhunter/pkg/engine" + ) + + // ErrUnknownFormat is returned by Get when no formatter is registered for the given name. + var ErrUnknownFormat = errors.New("output: unknown format") + + // Options controls formatter behavior. Unmask reveals full key values. + // ToolName/ToolVersion are used by SARIF and similar metadata-bearing formats. + type Options struct { + Unmask bool + ToolName string + ToolVersion string + } + + // Formatter renders a slice of findings to an io.Writer. + // Implementations must not mutate findings. + type Formatter interface { + Format(findings []engine.Finding, w io.Writer, opts Options) error + } + + var registry = map[string]Formatter{} + + // Register adds a formatter under the given name. Safe to call from init(). + func Register(name string, f Formatter) { + registry[name] = f + } + + // Get returns the formatter registered under name, or ErrUnknownFormat. + func Get(name string) (Formatter, error) { + f, ok := registry[name] + if !ok { + return nil, fmt.Errorf("%w: %q", ErrUnknownFormat, name) + } + return f, nil + } + + // Names returns the sorted list of registered format names (for --help and errors). + func Names() []string { + names := make([]string, 0, len(registry)) + for k := range registry { + names = append(names, k) + } + // Sort to stabilize help output. + for i := 1; i < len(names); i++ { + for j := i; j > 0 && names[j-1] > names[j]; j-- { + names[j-1], names[j] = names[j], names[j-1] + } + } + return names + } + ``` + + 3. Create pkg/output/colors.go: + ```go + package output + + import ( + "io" + "os" + + "github.com/mattn/go-isatty" + ) + + // IsTTY reports whether f is an open terminal. 
+ func IsTTY(f *os.File) bool { + if f == nil { + return false + } + return isatty.IsTerminal(f.Fd()) || isatty.IsCygwinTerminal(f.Fd()) + } + + // ColorsEnabled reports whether ANSI color output should be emitted on w. + // Returns false when w is not an *os.File, when it is not a TTY, or when + // the NO_COLOR environment variable is set (https://no-color.org/). + func ColorsEnabled(w io.Writer) bool { + if _, ok := os.LookupEnv("NO_COLOR"); ok { + return false + } + f, ok := w.(*os.File) + if !ok { + return false + } + return IsTTY(f) + } + ``` + + 4. Create pkg/output/formatter_test.go with table-driven tests covering Register/Get/ErrUnknownFormat/Names. + + 5. Create pkg/output/colors_test.go: ColorsEnabled(&bytes.Buffer{})==false; NO_COLOR via t.Setenv forces false. + + + cd /home/salva/Documents/apikey && go test ./pkg/output/... -run "TestFormatter|TestColors" -count=1 + + + - `go build ./...` succeeds + - `go test ./pkg/output/... -run "TestFormatter|TestColors"` passes + - `grep -q "github.com/mattn/go-isatty" go.mod` confirms direct dep + - `grep -q "type Formatter interface" pkg/output/formatter.go` + - `grep -q "ErrUnknownFormat" pkg/output/formatter.go` + + + + + Task 2: Refactor table.go into TableFormatter, strip colors for non-TTY, register under "table" + pkg/output/table.go, pkg/output/table_test.go + + - pkg/output/formatter.go (from Task 1) + - pkg/output/colors.go (from Task 1) + - pkg/output/table.go (current impl) + + + - TableFormatter{} implements Formatter. + - Writes to the provided io.Writer (not os.Stdout). + - When ColorsEnabled(w)==false, no ANSI escape sequences appear in output (strip by using lipgloss.SetColorProfile or by constructing plain styles). + - Preserves existing layout: PROVIDER/KEY/CONFIDENCE/SOURCE/LINE columns; VERIFY column when any finding is verified; indented metadata line. + - Empty slice -> "No API keys found.\n". + - Non-empty -> footer "\n{N} key(s) found.\n". 
+ - Respects opts.Unmask: Unmask=true uses KeyValue, false uses KeyMasked. + - Keeps PrintFindings(findings, unmask) as a thin backward-compatible wrapper that delegates to TableFormatter.Format(findings, os.Stdout, Options{Unmask: unmask}) — existing scan.go calls still compile until Plan 06. + - Tests assert: (a) empty case string equality; (b) verified+unverified columns; (c) NO_COLOR=1 output contains no "\x1b["; (d) metadata line is sorted. + + + Rewrite pkg/output/table.go: + + ```go + package output + + import ( + "fmt" + "io" + "os" + "sort" + "strings" + + "github.com/charmbracelet/lipgloss" + "github.com/salvacybersec/keyhunter/pkg/engine" + ) + + func init() { + Register("table", TableFormatter{}) + } + + // TableFormatter renders findings as a colored terminal table. + // Colors are automatically stripped when the writer is not a TTY or + // when NO_COLOR is set. + type TableFormatter struct{} + + func (TableFormatter) Format(findings []engine.Finding, w io.Writer, opts Options) error { + if len(findings) == 0 { + _, err := fmt.Fprintln(w, "No API keys found.") + return err + } + + colored := ColorsEnabled(w) + style := newTableStyles(colored) + + anyVerified := false + for _, f := range findings { + if f.Verified { + anyVerified = true + break + } + } + + if anyVerified { + fmt.Fprintf(w, "%-20s %-40s %-10s %-30s %-5s %s\n", + style.header.Render("PROVIDER"), + style.header.Render("KEY"), + style.header.Render("CONFIDENCE"), + style.header.Render("SOURCE"), + style.header.Render("LINE"), + style.header.Render("VERIFY"), + ) + } else { + fmt.Fprintf(w, "%-20s %-40s %-10s %-30s %s\n", + style.header.Render("PROVIDER"), + style.header.Render("KEY"), + style.header.Render("CONFIDENCE"), + style.header.Render("SOURCE"), + style.header.Render("LINE"), + ) + } + fmt.Fprintln(w, style.divider.Render(strings.Repeat("─", 106))) + + for _, f := range findings { + keyDisplay := f.KeyMasked + if opts.Unmask { + keyDisplay = f.KeyValue + } + confStyle := style.low + 
switch f.Confidence { + case "high": + confStyle = style.high + case "medium": + confStyle = style.medium + } + if anyVerified { + fmt.Fprintf(w, "%-20s %-40s %-10s %-30s %-5d %s\n", + f.ProviderName, keyDisplay, confStyle.Render(f.Confidence), + truncate(f.Source, 28), f.LineNumber, verifySymbolStyled(f, style), + ) + } else { + fmt.Fprintf(w, "%-20s %-40s %-10s %-30s %d\n", + f.ProviderName, keyDisplay, confStyle.Render(f.Confidence), + truncate(f.Source, 28), f.LineNumber, + ) + } + if len(f.VerifyMetadata) > 0 { + parts := make([]string, 0, len(f.VerifyMetadata)) + for k, v := range f.VerifyMetadata { + parts = append(parts, fmt.Sprintf("%s: %s", k, v)) + } + sort.Strings(parts) + fmt.Fprintf(w, " ↳ %s\n", strings.Join(parts, ", ")) + } + } + fmt.Fprintf(w, "\n%d key(s) found.\n", len(findings)) + return nil + } + + type tableStyles struct { + header, divider, high, medium, low lipgloss.Style + verifyLive, verifyDead, verifyRate, verifyErr, verifyUnk lipgloss.Style + } + + func newTableStyles(colored bool) tableStyles { + if !colored { + plain := lipgloss.NewStyle() + return tableStyles{ + header: plain, divider: plain, high: plain, medium: plain, low: plain, + verifyLive: plain, verifyDead: plain, verifyRate: plain, verifyErr: plain, verifyUnk: plain, + } + } + return tableStyles{ + header: lipgloss.NewStyle().Bold(true).Underline(true), + divider: lipgloss.NewStyle().Foreground(lipgloss.Color("8")), + high: lipgloss.NewStyle().Foreground(lipgloss.Color("2")), + medium: lipgloss.NewStyle().Foreground(lipgloss.Color("3")), + low: lipgloss.NewStyle().Foreground(lipgloss.Color("1")), + verifyLive: lipgloss.NewStyle().Foreground(lipgloss.Color("2")), + verifyDead: lipgloss.NewStyle().Foreground(lipgloss.Color("1")), + verifyRate: lipgloss.NewStyle().Foreground(lipgloss.Color("3")), + verifyErr: lipgloss.NewStyle().Foreground(lipgloss.Color("1")), + verifyUnk: lipgloss.NewStyle().Foreground(lipgloss.Color("8")), + } + } + + func verifySymbolStyled(f engine.Finding, 
s tableStyles) string { + if !f.Verified { + return "" + } + switch f.VerifyStatus { + case "live": + return s.verifyLive.Render("✓ live") + case "dead": + return s.verifyDead.Render("✗ dead") + case "rate_limited": + return s.verifyRate.Render("⚠ rate") + case "error": + return s.verifyErr.Render("! err") + default: + return s.verifyUnk.Render("? unk") + } + } + + // PrintFindings is a backward-compatible wrapper for existing callers. + // Deprecated: use TableFormatter.Format directly. + func PrintFindings(findings []engine.Finding, unmask bool) { + _ = TableFormatter{}.Format(findings, os.Stdout, Options{Unmask: unmask}) + } + + func truncate(s string, max int) string { + if len(s) <= max { + return s + } + return "..." + s[len(s)-max+3:] + } + ``` + + Create pkg/output/table_test.go: + - TestTableFormatter_Empty: asserts exact string "No API keys found.\n" to bytes.Buffer. + - TestTableFormatter_NoColorInBuffer: two findings, one verified; asserts output does NOT contain "\x1b[" (bytes.Buffer is not a TTY). + - TestTableFormatter_UnverifiedLayout: asserts header line does not contain "VERIFY". + - TestTableFormatter_VerifiedLayout: asserts header includes "VERIFY" and row contains "live" when VerifyStatus=="live". + - TestTableFormatter_Masking: Unmask=false renders f.KeyMasked; Unmask=true renders f.KeyValue. + - TestTableFormatter_MetadataSorted: VerifyMetadata={"z":"1","a":"2"} renders "a: 2, z: 1". + + + cd /home/salva/Documents/apikey && go test ./pkg/output/... -run "TestTableFormatter" -count=1 && go build ./... + + + - All TestTableFormatter_* tests pass + - `go build ./...` succeeds (scan.go still uses PrintFindings wrapper) + - `grep -q "Register(\"table\"" pkg/output/table.go` + - `grep -q "TableFormatter" pkg/output/table.go` + - Test output confirms no "\x1b[" when writing to bytes.Buffer + + + + + + +- `go build ./...` succeeds +- `go test ./pkg/output/... 
-count=1` all green +- `grep -q "ErrUnknownFormat\|type Formatter interface\|TableFormatter" pkg/output/*.go` +- isatty is a direct dependency: `grep -E "^\tgithub.com/mattn/go-isatty" go.mod` (not in indirect block) + + + +- Formatter interface + Registry + Options exist and are tested +- TableFormatter implements Formatter, registered as "table" +- Colors stripped when writer is not a TTY or NO_COLOR set +- Existing PrintFindings wrapper keeps scan.go compiling +- Foundation ready for JSON/SARIF/CSV formatters in Wave 1 + + + +After completion, create `.planning/phases/06-output-reporting/06-01-SUMMARY.md`. + diff --git a/.planning/phases/06-output-reporting/06-02-PLAN.md b/.planning/phases/06-output-reporting/06-02-PLAN.md new file mode 100644 index 0000000..db8dadd --- /dev/null +++ b/.planning/phases/06-output-reporting/06-02-PLAN.md @@ -0,0 +1,275 @@ +--- +phase: 06-output-reporting +plan: 02 +type: execute +wave: 1 +depends_on: [06-01] +files_modified: + - pkg/output/json.go + - pkg/output/csv.go + - pkg/output/json_test.go + - pkg/output/csv_test.go +autonomous: true +requirements: [OUT-02, OUT-04] + +must_haves: + truths: + - "scan results can be rendered as well-formed JSON (one array of finding objects)" + - "scan results can be rendered as CSV with a stable header row" + - "Both formatters honor the Unmask option for KeyValue exposure" + - "Both formatters are registered in output.Registry under 'json' and 'csv'" + artifacts: + - path: pkg/output/json.go + provides: "JSONFormatter implementing Formatter" + contains: "type JSONFormatter struct" + - path: pkg/output/csv.go + provides: "CSVFormatter implementing Formatter" + contains: "type CSVFormatter struct" + key_links: + - from: pkg/output/json.go + to: pkg/output/formatter.go + via: "init() Register(\"json\", JSONFormatter{})" + pattern: "Register\\(\"json\"" + - from: pkg/output/csv.go + to: pkg/output/formatter.go + via: "init() Register(\"csv\", CSVFormatter{})" + pattern: "Register\\(\"csv\"" 
+--- + + +Implement JSONFormatter (full Finding serialization) and CSVFormatter (header row + flat rows) so `keyhunter scan --output=json` and `--output=csv` work end to end after Plan 06 wires the scan command. + +Purpose: Machine-readable outputs for pipelines and spreadsheets. Addresses OUT-02 and OUT-04. +Output: `pkg/output/json.go`, `pkg/output/csv.go`, tests. + + + +@$HOME/.claude/get-shit-done/workflows/execute-plan.md +@$HOME/.claude/get-shit-done/templates/summary.md + + + +@.planning/phases/06-output-reporting/06-CONTEXT.md +@.planning/phases/06-output-reporting/06-01-PLAN.md +@pkg/engine/finding.go + + +From Plan 06-01 (pkg/output/formatter.go): +```go +type Formatter interface { + Format(findings []engine.Finding, w io.Writer, opts Options) error +} +type Options struct { + Unmask bool + ToolName string + ToolVersion string +} +func Register(name string, f Formatter) +``` +From pkg/engine/finding.go: full Finding struct with Verify* fields. + + + + + + + Task 1: JSONFormatter with full finding fields + pkg/output/json.go, pkg/output/json_test.go + + - pkg/output/formatter.go (Formatter interface, Options) + - pkg/engine/finding.go + + + - Output is a JSON array: `[{...}, {...}]` with 2-space indent. + - Each element includes: provider, key (full when Unmask, masked when not), key_masked, confidence, source, source_type, line, offset, detected_at (RFC3339), verified, verify_status, verify_http_code, verify_metadata, verify_error. + - Empty findings slice -> `[]\n`. + - Uses encoding/json Encoder with SetIndent("", " "). + - Tests: (a) empty slice -> "[]\n"; (b) one finding round-trips through json.Unmarshal with key==KeyMasked when Unmask=false; (c) Unmask=true sets key==KeyValue; (d) verify fields present when Verified=true. 
+ + + Create pkg/output/json.go: + + ```go + package output + + import ( + "encoding/json" + "io" + "time" + + "github.com/salvacybersec/keyhunter/pkg/engine" + ) + + func init() { + Register("json", JSONFormatter{}) + } + + // JSONFormatter renders findings as a JSON array with 2-space indent. + type JSONFormatter struct{} + + type jsonFinding struct { + Provider string `json:"provider"` + Key string `json:"key"` + KeyMasked string `json:"key_masked"` + Confidence string `json:"confidence"` + Source string `json:"source"` + SourceType string `json:"source_type"` + Line int `json:"line"` + Offset int64 `json:"offset"` + DetectedAt string `json:"detected_at"` + Verified bool `json:"verified"` + VerifyStatus string `json:"verify_status,omitempty"` + VerifyHTTPCode int `json:"verify_http_code,omitempty"` + VerifyMetadata map[string]string `json:"verify_metadata,omitempty"` + VerifyError string `json:"verify_error,omitempty"` + } + + func (JSONFormatter) Format(findings []engine.Finding, w io.Writer, opts Options) error { + out := make([]jsonFinding, 0, len(findings)) + for _, f := range findings { + key := f.KeyMasked + if opts.Unmask { + key = f.KeyValue + } + out = append(out, jsonFinding{ + Provider: f.ProviderName, + Key: key, + KeyMasked: f.KeyMasked, + Confidence: f.Confidence, + Source: f.Source, + SourceType: f.SourceType, + Line: f.LineNumber, + Offset: f.Offset, + DetectedAt: f.DetectedAt.Format(time.RFC3339), + Verified: f.Verified, + VerifyStatus: f.VerifyStatus, + VerifyHTTPCode: f.VerifyHTTPCode, + VerifyMetadata: f.VerifyMetadata, + VerifyError: f.VerifyError, + }) + } + enc := json.NewEncoder(w) + enc.SetIndent("", " ") + return enc.Encode(out) + } + ``` + + Create pkg/output/json_test.go with tests for empty, masked, unmask, verify fields. Use json.Unmarshal to assert field values. + + + cd /home/salva/Documents/apikey && go test ./pkg/output/... 
-run "TestJSONFormatter" -count=1 + + + - TestJSONFormatter_* all pass + - `grep -q "Register(\"json\"" pkg/output/json.go` + - `go build ./...` succeeds + + + + + Task 2: CSVFormatter with stable header row + pkg/output/csv.go, pkg/output/csv_test.go + + - pkg/output/formatter.go + - pkg/engine/finding.go + + + - Header: id,provider,confidence,key,source,line,detected_at,verified,verify_status + - id is the zero-based index within the findings slice (scan-time id; DB id not available here). + - key column renders KeyMasked when Unmask=false, KeyValue when Unmask=true. + - verified column is "true"/"false". + - Uses encoding/csv Writer; flushes on return. + - Empty findings still writes header row only. + - Tests: header presence; masked vs unmask; CSV quoting of comma in Source; verify_status column populated. + + + Create pkg/output/csv.go: + + ```go + package output + + import ( + "encoding/csv" + "io" + "strconv" + "time" + + "github.com/salvacybersec/keyhunter/pkg/engine" + ) + + func init() { + Register("csv", CSVFormatter{}) + } + + // CSVFormatter renders findings as comma-separated values with a fixed header row. 
+ type CSVFormatter struct{} + + var csvHeader = []string{ + "id", "provider", "confidence", "key", "source", + "line", "detected_at", "verified", "verify_status", + } + + func (CSVFormatter) Format(findings []engine.Finding, w io.Writer, opts Options) error { + cw := csv.NewWriter(w) + if err := cw.Write(csvHeader); err != nil { + return err + } + for i, f := range findings { + key := f.KeyMasked + if opts.Unmask { + key = f.KeyValue + } + row := []string{ + strconv.Itoa(i), + f.ProviderName, + f.Confidence, + key, + f.Source, + strconv.Itoa(f.LineNumber), + f.DetectedAt.Format(time.RFC3339), + strconv.FormatBool(f.Verified), + f.VerifyStatus, + } + if err := cw.Write(row); err != nil { + return err + } + } + cw.Flush() + return cw.Error() + } + ``` + + Create pkg/output/csv_test.go: + - TestCSVFormatter_HeaderOnly: empty findings -> single header line. + - TestCSVFormatter_Row: one finding, parse with csv.NewReader, assert fields. + - TestCSVFormatter_QuotesCommaInSource: Source="a, b.txt" round-trips via csv reader. + - TestCSVFormatter_Unmask: Unmask=true puts KeyValue into key column. + + + cd /home/salva/Documents/apikey && go test ./pkg/output/... -run "TestCSVFormatter" -count=1 + + + - TestCSVFormatter_* all pass + - `grep -q "Register(\"csv\"" pkg/output/csv.go` + - Header exactly matches: `grep -q 'id", "provider", "confidence", "key", "source"' pkg/output/csv.go` + - `go build ./...` succeeds + + + + + + +- `go test ./pkg/output/... -count=1` all green +- Both formats registered: `grep -h "Register(" pkg/output/*.go` shows table, json, csv (sarif added in Plan 03) + + + +- JSONFormatter and CSVFormatter implement Formatter +- Both are registered on package init +- Unmask option propagates to key column +- All unit tests pass + + + +After completion, create `.planning/phases/06-output-reporting/06-02-SUMMARY.md`. 
+ diff --git a/.planning/phases/06-output-reporting/06-03-PLAN.md b/.planning/phases/06-output-reporting/06-03-PLAN.md new file mode 100644 index 0000000..17ba001 --- /dev/null +++ b/.planning/phases/06-output-reporting/06-03-PLAN.md @@ -0,0 +1,302 @@ +--- +phase: 06-output-reporting +plan: 03 +type: execute +wave: 1 +depends_on: [06-01] +files_modified: + - pkg/output/sarif.go + - pkg/output/sarif_test.go +autonomous: true +requirements: [OUT-03] + +must_haves: + truths: + - "scan results can be rendered as SARIF 2.1.0 JSON suitable for GitHub Security upload" + - "one SARIF rule is emitted per distinct provider observed in the findings" + - "each result maps confidence to SARIF level (high=error, medium=warning, low=note)" + - "each result has a physicalLocation with artifactLocation.uri and region.startLine" + artifacts: + - path: pkg/output/sarif.go + provides: "SARIFFormatter + SARIF 2.1.0 structs" + contains: "SARIFFormatter" + key_links: + - from: pkg/output/sarif.go + to: pkg/output/formatter.go + via: "init() Register(\"sarif\", SARIFFormatter{})" + pattern: "Register\\(\"sarif\"" + - from: pkg/output/sarif.go + to: "SARIF 2.1.0 schema" + via: "$schema + version fields" + pattern: "2.1.0" +--- + + +Implement a SARIF 2.1.0 formatter using hand-rolled structs (CLAUDE.md constraint: no SARIF library). Emits a schema-valid report that GitHub's code scanning accepts on upload. + +Purpose: CI/CD integration (CICD-02 downstream in Phase 7 depends on this). Addresses OUT-03. +Output: `pkg/output/sarif.go`, `pkg/output/sarif_test.go`. 
+ + + +@$HOME/.claude/get-shit-done/workflows/execute-plan.md +@$HOME/.claude/get-shit-done/templates/summary.md + + + +@.planning/phases/06-output-reporting/06-CONTEXT.md +@.planning/phases/06-output-reporting/06-01-PLAN.md +@pkg/engine/finding.go + + +From Plan 06-01: +```go +type Formatter interface { + Format(findings []engine.Finding, w io.Writer, opts Options) error +} +type Options struct { + Unmask bool + ToolName string // "keyhunter" + ToolVersion string // e.g. "0.6.0" +} +``` + +SARIF 2.1.0 reference minimal shape: +```json +{ + "$schema": "https://json.schemastore.org/sarif-2.1.0.json", + "version": "2.1.0", + "runs": [{ + "tool": { "driver": { "name": "...", "version": "...", "rules": [{"id":"...","name":"...","shortDescription":{"text":"..."}}] } }, + "results": [{ + "ruleId": "...", + "level": "error|warning|note", + "message": { "text": "..." }, + "locations": [{ + "physicalLocation": { + "artifactLocation": { "uri": "..." }, + "region": { "startLine": 1 } + } + }] + }] + }] +} +``` + + + + + + + Task 1: SARIF 2.1.0 struct definitions + SARIFFormatter + pkg/output/sarif.go, pkg/output/sarif_test.go + + - pkg/output/formatter.go + - pkg/output/json.go (for consistent encoding style) + - pkg/engine/finding.go + + + - SARIFFormatter implements Formatter. + - Output JSON contains top-level $schema="https://json.schemastore.org/sarif-2.1.0.json" and version="2.1.0". + - runs[0].tool.driver.name = opts.ToolName (fallback "keyhunter"), version = opts.ToolVersion (fallback "dev"). + - rules are deduped by provider name; rule.id == provider name; rule.name == provider name; rule.shortDescription.text = "Leaked API key". + - results: one per finding. ruleId = providerName. level: high->"error", medium->"warning", low->"note", default "warning". + - message.text: "Detected key (): " where key is masked unless opts.Unmask. + - locations[0].physicalLocation.artifactLocation.uri = f.Source (unchanged path). 
+ - locations[0].physicalLocation.region.startLine = max(1, f.LineNumber) (SARIF requires >= 1). + - Empty findings: still emit a valid SARIF doc with empty rules and empty results. + - Tests: + * TestSARIF_Empty: parse output, assert version=="2.1.0", len(runs)==1, len(results)==0, len(rules)==0. + * TestSARIF_DedupRules: two findings same provider -> len(rules)==1. + * TestSARIF_LevelMapping: high/medium/low -> error/warning/note. + * TestSARIF_LineFloor: f.LineNumber=0 -> region.startLine==1. + * TestSARIF_Masking: opts.Unmask=false -> message.text contains KeyMasked, not KeyValue. + * TestSARIF_ToolVersionFallback: empty Options uses "keyhunter"/"dev". + + + Create pkg/output/sarif.go: + + ```go + package output + + import ( + "encoding/json" + "fmt" + "io" + + "github.com/salvacybersec/keyhunter/pkg/engine" + ) + + func init() { + Register("sarif", SARIFFormatter{}) + } + + // SARIFFormatter emits SARIF 2.1.0 JSON suitable for CI uploads. + type SARIFFormatter struct{} + + type sarifDoc struct { + Schema string `json:"$schema"` + Version string `json:"version"` + Runs []sarifRun `json:"runs"` + } + + type sarifRun struct { + Tool sarifTool `json:"tool"` + Results []sarifResult `json:"results"` + } + + type sarifTool struct { + Driver sarifDriver `json:"driver"` + } + + type sarifDriver struct { + Name string `json:"name"` + Version string `json:"version"` + Rules []sarifRule `json:"rules"` + } + + type sarifRule struct { + ID string `json:"id"` + Name string `json:"name"` + ShortDescription sarifText `json:"shortDescription"` + } + + type sarifText struct { + Text string `json:"text"` + } + + type sarifResult struct { + RuleID string `json:"ruleId"` + Level string `json:"level"` + Message sarifText `json:"message"` + Locations []sarifLocation `json:"locations"` + } + + type sarifLocation struct { + PhysicalLocation sarifPhysicalLocation `json:"physicalLocation"` + } + + type sarifPhysicalLocation struct { + ArtifactLocation sarifArtifactLocation 
`json:"artifactLocation"` + Region sarifRegion `json:"region"` + } + + type sarifArtifactLocation struct { + URI string `json:"uri"` + } + + type sarifRegion struct { + StartLine int `json:"startLine"` + } + + func (SARIFFormatter) Format(findings []engine.Finding, w io.Writer, opts Options) error { + toolName := opts.ToolName + if toolName == "" { + toolName = "keyhunter" + } + toolVersion := opts.ToolVersion + if toolVersion == "" { + toolVersion = "dev" + } + + // Dedup rules by provider, preserving first-seen order. + seen := map[string]bool{} + rules := make([]sarifRule, 0) + for _, f := range findings { + if seen[f.ProviderName] { + continue + } + seen[f.ProviderName] = true + rules = append(rules, sarifRule{ + ID: f.ProviderName, + Name: f.ProviderName, + ShortDescription: sarifText{Text: fmt.Sprintf("Leaked %s API key", f.ProviderName)}, + }) + } + + results := make([]sarifResult, 0, len(findings)) + for _, f := range findings { + key := f.KeyMasked + if opts.Unmask { + key = f.KeyValue + } + startLine := f.LineNumber + if startLine < 1 { + startLine = 1 + } + results = append(results, sarifResult{ + RuleID: f.ProviderName, + Level: sarifLevel(f.Confidence), + Message: sarifText{Text: fmt.Sprintf("Detected %s key (%s): %s", f.ProviderName, f.Confidence, key)}, + Locations: []sarifLocation{{ + PhysicalLocation: sarifPhysicalLocation{ + ArtifactLocation: sarifArtifactLocation{URI: f.Source}, + Region: sarifRegion{StartLine: startLine}, + }, + }}, + }) + } + + doc := sarifDoc{ + Schema: "https://json.schemastore.org/sarif-2.1.0.json", + Version: "2.1.0", + Runs: []sarifRun{{ + Tool: sarifTool{Driver: sarifDriver{ + Name: toolName, + Version: toolVersion, + Rules: rules, + }}, + Results: results, + }}, + } + + enc := json.NewEncoder(w) + enc.SetIndent("", " ") + return enc.Encode(doc) + } + + func sarifLevel(confidence string) string { + switch confidence { + case "high": + return "error" + case "medium": + return "warning" + case "low": + return "note" + 
default: + return "warning" + } + } + ``` + + Create pkg/output/sarif_test.go implementing all six test cases listed in . Use json.Unmarshal into sarifDoc to assert structural fields. + + + cd /home/salva/Documents/apikey && go test ./pkg/output/... -run "TestSARIF" -count=1 + + + - All TestSARIF_* tests pass + - `grep -q "Register(\"sarif\"" pkg/output/sarif.go` + - `grep -q '"2.1.0"' pkg/output/sarif.go` + - `grep -q "sarifLevel" pkg/output/sarif.go` and covers high/medium/low + - `go build ./...` succeeds + + + + + + +- `go test ./pkg/output/... -count=1` all green +- All four formatters (table, json, csv, sarif) are registered + + + +- SARIFFormatter produces 2.1.0-compliant documents +- Rules deduped per provider +- Confidence -> level mapping is deterministic +- Ready for CI/CD integration in Phase 7 + + + +After completion, create `.planning/phases/06-output-reporting/06-03-SUMMARY.md`. + diff --git a/.planning/phases/06-output-reporting/06-04-PLAN.md b/.planning/phases/06-output-reporting/06-04-PLAN.md new file mode 100644 index 0000000..0dcdd72 --- /dev/null +++ b/.planning/phases/06-output-reporting/06-04-PLAN.md @@ -0,0 +1,287 @@ +--- +phase: 06-output-reporting +plan: 04 +type: execute +wave: 1 +depends_on: [] +files_modified: + - pkg/storage/queries.go + - pkg/storage/queries_test.go +autonomous: true +requirements: [KEYS-01, KEYS-02, KEYS-06] + +must_haves: + truths: + - "the keys command (Plan 05) can list findings with filters" + - "the keys command can fetch a single finding by ID" + - "the keys command can delete a finding by ID" + - "existing db.ListFindings remains backward compatible" + artifacts: + - path: pkg/storage/queries.go + provides: "Filters, ListFindingsFiltered, GetFinding, DeleteFinding" + exports: ["Filters", "ListFindingsFiltered", "GetFinding", "DeleteFinding"] + key_links: + - from: pkg/storage/queries.go + to: pkg/storage/findings.go + via: "Reuses encrypt/decrypt + Finding struct" + pattern: "Decrypt\\(encrypted, encKey\\)" 
+--- + + +Add a thin query layer on top of findings.go providing filtered list, single-record lookup, and delete. These are the DB primitives the `keyhunter keys` command tree (Plan 05) will call. + +Purpose: Key management (KEYS-01, KEYS-02, KEYS-06 foundation; KEYS-03/04/05 built on top in Plan 05). +Output: `pkg/storage/queries.go` + tests using in-memory SQLite (":memory:"). + + + +@$HOME/.claude/get-shit-done/workflows/execute-plan.md +@$HOME/.claude/get-shit-done/templates/summary.md + + + +@.planning/phases/06-output-reporting/06-CONTEXT.md +@pkg/storage/findings.go +@pkg/storage/db.go + + +From pkg/storage/findings.go: +```go +type Finding struct { + ID int64 + ScanID int64 + ProviderName string + KeyValue string // plaintext after decrypt + KeyMasked string + Confidence string + SourcePath string + SourceType string + LineNumber int + CreatedAt time.Time + Verified bool + VerifyStatus string + VerifyHTTPCode int + VerifyMetadata map[string]string +} +func (db *DB) SaveFinding(f Finding, encKey []byte) (int64, error) +func (db *DB) ListFindings(encKey []byte) ([]Finding, error) +``` +DB schema columns for findings table: id, scan_id, provider_name, key_value (encrypted BLOB), key_masked, confidence, source_path, source_type, line_number, verified, verify_status, verify_http_code, verify_metadata_json, created_at. + + + + + + + Task 1: Filters struct, ListFindingsFiltered, GetFinding, DeleteFinding + pkg/storage/queries.go, pkg/storage/queries_test.go + + - pkg/storage/findings.go (row scan helpers, columns) + - pkg/storage/db.go (Open, schema usage) + - pkg/storage/keys.go or wherever DeriveKey/NewSalt live (for test setup) + + + - Filters: { Provider string, Verified *bool, Limit int, Offset int }. 
+ - ListFindingsFiltered(encKey, filters) returns decrypted findings matching: + * Provider (exact match, empty = no filter) + * Verified (nil = no filter; ptr-bool matches 1/0) + * ORDER BY created_at DESC, id DESC + * Limit/Offset applied only when Limit > 0 + - GetFinding(id, encKey) returns *Finding or (nil, sql.ErrNoRows) when absent. + - DeleteFinding(id) runs DELETE; returns (rowsAffected int64, error). Zero rows affected is not an error (caller decides). + - Tests (using ":memory:" DB + DeriveKey with a test salt): + * Seed 3 findings across 2 providers with mixed verified status. + * TestListFindingsFiltered_ByProvider: filter provider=="openai" returns only openai rows. + * TestListFindingsFiltered_Verified: Verified=&true returns only verified rows. + * TestListFindingsFiltered_Pagination: Limit=1, Offset=1 returns second row. + * TestGetFinding_Hit: returns row with decrypted KeyValue matching original. + * TestGetFinding_Miss: returns nil, sql.ErrNoRows for id=9999. + * TestDeleteFinding_Hit: rowsAffected==1, subsequent Get returns sql.ErrNoRows. + * TestDeleteFinding_Miss: rowsAffected==0, no error. + + + Create pkg/storage/queries.go: + + ```go + package storage + + import ( + "database/sql" + "encoding/json" + "fmt" + "strings" + "time" + ) + + // Filters selects a subset of findings for ListFindingsFiltered. + // Empty Provider means "any provider". Nil Verified means "any verified state". + // Limit <= 0 disables pagination. + type Filters struct { + Provider string + Verified *bool + Limit int + Offset int + } + + // ListFindingsFiltered returns findings matching the given filters, newest first. + // Key values are decrypted before return. encKey must match the key used at save time. 
+ func (db *DB) ListFindingsFiltered(encKey []byte, f Filters) ([]Finding, error) { + var ( + where []string + args []interface{} + ) + if f.Provider != "" { + where = append(where, "provider_name = ?") + args = append(args, f.Provider) + } + if f.Verified != nil { + where = append(where, "verified = ?") + if *f.Verified { + args = append(args, 1) + } else { + args = append(args, 0) + } + } + q := `SELECT id, scan_id, provider_name, key_value, key_masked, confidence, + source_path, source_type, line_number, + verified, verify_status, verify_http_code, verify_metadata_json, + created_at + FROM findings` + if len(where) > 0 { + q += " WHERE " + strings.Join(where, " AND ") + } + q += " ORDER BY created_at DESC, id DESC" + if f.Limit > 0 { + q += " LIMIT ? OFFSET ?" + args = append(args, f.Limit, f.Offset) + } + rows, err := db.sql.Query(q, args...) + if err != nil { + return nil, fmt.Errorf("querying findings: %w", err) + } + defer rows.Close() + var out []Finding + for rows.Next() { + f, err := scanFindingRow(rows, encKey) + if err != nil { + return nil, err + } + out = append(out, f) + } + return out, rows.Err() + } + + // GetFinding returns a single finding by id. Returns sql.ErrNoRows if absent. + func (db *DB) GetFinding(id int64, encKey []byte) (*Finding, error) { + row := db.sql.QueryRow( + `SELECT id, scan_id, provider_name, key_value, key_masked, confidence, + source_path, source_type, line_number, + verified, verify_status, verify_http_code, verify_metadata_json, + created_at + FROM findings WHERE id = ?`, id) + f, err := scanFindingRowFromRow(row, encKey) + if err != nil { + return nil, err + } + return &f, nil + } + + // DeleteFinding removes the finding with the given id. + // Returns the number of rows affected (0 if no such id). 
+ func (db *DB) DeleteFinding(id int64) (int64, error) { + res, err := db.sql.Exec(`DELETE FROM findings WHERE id = ?`, id) + if err != nil { + return 0, fmt.Errorf("deleting finding %d: %w", id, err) + } + return res.RowsAffected() + } + + // scanFindingRow reads one Finding from *sql.Rows and decrypts its key. + func scanFindingRow(rows *sql.Rows, encKey []byte) (Finding, error) { + var f Finding + var encrypted []byte + var createdAt string + var scanID sql.NullInt64 + var verifiedInt int + var metaJSON sql.NullString + if err := rows.Scan( + &f.ID, &scanID, &f.ProviderName, &encrypted, &f.KeyMasked, + &f.Confidence, &f.SourcePath, &f.SourceType, &f.LineNumber, + &verifiedInt, &f.VerifyStatus, &f.VerifyHTTPCode, &metaJSON, + &createdAt, + ); err != nil { + return f, fmt.Errorf("scanning finding row: %w", err) + } + return hydrateFinding(f, encrypted, scanID, verifiedInt, metaJSON, createdAt, encKey) + } + + func scanFindingRowFromRow(row *sql.Row, encKey []byte) (Finding, error) { + var f Finding + var encrypted []byte + var createdAt string + var scanID sql.NullInt64 + var verifiedInt int + var metaJSON sql.NullString + if err := row.Scan( + &f.ID, &scanID, &f.ProviderName, &encrypted, &f.KeyMasked, + &f.Confidence, &f.SourcePath, &f.SourceType, &f.LineNumber, + &verifiedInt, &f.VerifyStatus, &f.VerifyHTTPCode, &metaJSON, + &createdAt, + ); err != nil { + return f, err // includes sql.ErrNoRows — let caller detect + } + return hydrateFinding(f, encrypted, scanID, verifiedInt, metaJSON, createdAt, encKey) + } + + func hydrateFinding(f Finding, encrypted []byte, scanID sql.NullInt64, verifiedInt int, metaJSON sql.NullString, createdAt string, encKey []byte) (Finding, error) { + if scanID.Valid { + f.ScanID = scanID.Int64 + } + f.Verified = verifiedInt != 0 + if metaJSON.Valid && metaJSON.String != "" { + m := map[string]string{} + if err := json.Unmarshal([]byte(metaJSON.String), &m); err != nil { + return f, fmt.Errorf("unmarshaling verify metadata for finding 
%d: %w", f.ID, err) + } + f.VerifyMetadata = m + } + plain, err := Decrypt(encrypted, encKey) + if err != nil { + return f, fmt.Errorf("decrypting finding %d: %w", f.ID, err) + } + f.KeyValue = string(plain) + f.CreatedAt, _ = time.Parse("2006-01-02 15:04:05", createdAt) + return f, nil + } + ``` + + Create pkg/storage/queries_test.go with the seven tests from . Use `storage.Open(":memory:")`, generate salt, DeriveKey, SaveFinding seed rows with distinct providers/verified flags, then exercise each query. Use testify assertions. + + + cd /home/salva/Documents/apikey && go test ./pkg/storage/... -run "TestListFindingsFiltered|TestGetFinding|TestDeleteFinding" -count=1 + + + - All seven query tests pass + - `grep -q "ListFindingsFiltered\|GetFinding\|DeleteFinding" pkg/storage/queries.go` + - `go build ./...` succeeds + - Existing `pkg/storage/...` tests still pass (no regressions in ListFindings) + + + + + + +- `go test ./pkg/storage/... -count=1` all green +- `grep -q "type Filters struct" pkg/storage/queries.go` + + + +- Filters struct supports provider, verified, pagination +- GetFinding returns sql.ErrNoRows on miss +- DeleteFinding returns rows affected +- All tests green with in-memory DB + + + +After completion, create `.planning/phases/06-output-reporting/06-04-SUMMARY.md`. 
+ diff --git a/.planning/phases/06-output-reporting/06-05-PLAN.md b/.planning/phases/06-output-reporting/06-05-PLAN.md new file mode 100644 index 0000000..d82bcc8 --- /dev/null +++ b/.planning/phases/06-output-reporting/06-05-PLAN.md @@ -0,0 +1,316 @@ +--- +phase: 06-output-reporting +plan: 05 +type: execute +wave: 2 +depends_on: [06-01, 06-02, 06-03, 06-04] +files_modified: + - cmd/keys.go + - cmd/keys_test.go + - cmd/stubs.go + - cmd/root.go +autonomous: true +requirements: [KEYS-01, KEYS-02, KEYS-03, KEYS-04, KEYS-05, KEYS-06] + +must_haves: + truths: + - "keyhunter keys list prints stored findings (masked by default)" + - "keyhunter keys show <id> prints a single finding in full detail" + - "keyhunter keys export --format=json|csv writes all findings to stdout or --output file" + - "keyhunter keys copy <id> places the full key on the system clipboard" + - "keyhunter keys delete <id> removes a finding with confirmation (bypassed by --yes)" + - "keyhunter keys verify <id> re-runs HTTPVerifier against the stored key" + artifacts: + - path: cmd/keys.go + provides: "keysCmd + list/show/export/copy/delete/verify subcommands" + contains: "keysCmd" + key_links: + - from: cmd/keys.go + to: pkg/storage/queries.go + via: "db.ListFindingsFiltered / GetFinding / DeleteFinding" + pattern: "ListFindingsFiltered|GetFinding|DeleteFinding" + - from: cmd/keys.go + to: pkg/output/formatter.go + via: "output.Get for json/csv export" + pattern: "output\\.Get\\(" + - from: cmd/keys.go + to: github.com/atotto/clipboard + via: "clipboard.WriteAll for keys copy" + pattern: "clipboard\\.WriteAll" + - from: cmd/root.go + to: cmd/keys.go + via: "AddCommand(keysCmd)" + pattern: "AddCommand\\(keysCmd\\)" +--- + + +Replace the `keys` stub with a real command tree implementing KEYS-01..06, reusing the storage query layer from Plan 04 and the formatter registry from Plans 01-03. + +Purpose: Fulfils all six key-management requirements.
+Output: `cmd/keys.go` with subcommands, removal of stub, tests for list/show/export/delete using in-memory DB. + + + +@$HOME/.claude/get-shit-done/workflows/execute-plan.md +@$HOME/.claude/get-shit-done/templates/summary.md + + + +@.planning/phases/06-output-reporting/06-CONTEXT.md +@.planning/phases/06-output-reporting/06-01-PLAN.md +@.planning/phases/06-output-reporting/06-04-PLAN.md +@cmd/scan.go +@cmd/stubs.go +@cmd/root.go +@pkg/storage/findings.go + + +From Plan 06-04: +```go +type Filters struct { Provider string; Verified *bool; Limit, Offset int } +func (db *DB) ListFindingsFiltered(encKey []byte, f Filters) ([]Finding, error) +func (db *DB) GetFinding(id int64, encKey []byte) (*Finding, error) +func (db *DB) DeleteFinding(id int64) (int64, error) +``` + +From cmd/scan.go, reusable helper: +```go +func loadOrCreateEncKey(db *storage.DB, passphrase string) ([]byte, error) +``` + +From Plan 06-01..03: +```go +output.Get("json"|"csv"|"sarif"|"table") (Formatter, error) +``` + +clipboard: github.com/atotto/clipboard (already in go.mod) — clipboard.WriteAll(string) error. + +verify package (Phase 5): verify.NewHTTPVerifier(timeout), VerifyAll(ctx, findings, reg, workers) — for `keys verify `. + + + + + + + Task 1: Implement keys command tree (list/show/export/copy/delete/verify) + cmd/keys.go, cmd/stubs.go, cmd/root.go + + - cmd/stubs.go (remove keysCmd stub) + - cmd/scan.go (loadOrCreateEncKey, db open pattern, verify wiring) + - cmd/root.go (AddCommand registration) + - pkg/storage/queries.go (Plan 04 output) + - pkg/output/formatter.go (Get, Options) + + + 1. Delete the `keysCmd` stub from cmd/stubs.go (keep the other stubs). Leave a comment if needed. + + 2. Create cmd/keys.go with the full command tree. 
Skeleton: + + ```go + package cmd + + import ( + "bufio" + "context" + "fmt" + "os" + "strconv" + "strings" + "time" + + "github.com/atotto/clipboard" + "github.com/salvacybersec/keyhunter/pkg/config" + "github.com/salvacybersec/keyhunter/pkg/engine" + "github.com/salvacybersec/keyhunter/pkg/output" + "github.com/salvacybersec/keyhunter/pkg/providers" + "github.com/salvacybersec/keyhunter/pkg/storage" + "github.com/salvacybersec/keyhunter/pkg/verify" + "github.com/spf13/cobra" + "github.com/spf13/viper" + ) + + var ( + flagKeysUnmask bool + flagKeysProvider string + flagKeysVerified bool + flagKeysLimit int + flagKeysFormat string + flagKeysOutFile string + flagKeysYes bool + ) + + var keysCmd = &cobra.Command{ + Use: "keys", + Short: "Manage stored API key findings", + } + + // ... list/show/export/copy/delete/verify subcommands below + ``` + + 3. Implement subcommands: + + - `keys list`: + * Flags: --unmask, --provider=string, --verified (tri-state via Changed(), so no separate "set" flag var is needed), --limit=int + * Opens DB via same pattern as scan.go; derives encKey via loadOrCreateEncKey. + * Builds storage.Filters. If cmd.Flag("verified").Changed, set Verified=&flagKeysVerified. + * Calls db.ListFindingsFiltered. + * Converts storage.Finding -> engine.Finding (inline helper: storageToEngine(f)). + * Prepends an ID column by printing a preamble line per finding or a compact table. Simplest: iterate and print `[ID] provider confidence masked-or-unmasked-key source:line verify_status` to stdout. Use lipgloss only if output.ColorsEnabled(os.Stdout). + * Footer: "N key(s).". + * Exit code 0 always for list (no findings is not an error). + + - `keys show <id>`: + * Args: cobra.ExactArgs(1). Parse id as int64. + * db.GetFinding(id, encKey). If sql.ErrNoRows: "no finding with id N", exit 1. + * ALWAYS unmasked (per KEYS-02).
+ * Print labeled fields: ID, Provider, Confidence, Key (full), Source, Line, SourceType, CreatedAt, Verified, VerifyStatus, VerifyHTTPCode, VerifyMetadata (sorted keys), VerifyError. + + - `keys export`: + * Flags: --format=json|csv (default json), --output=file (default stdout). + * Rejects any format other than "json" or "csv" with a clear error (mentions SARIF is scan-only, for now). + * Looks up the formatter via output.Get(flagKeysFormat). Unmask=true (export implies full keys per KEYS-03). + * If --output is set, write atomically: write to a .tmp file, then os.Rename. Use 0600 perms. + * Else write to os.Stdout. + + - `keys copy <id>`: + * Args: ExactArgs(1). + * GetFinding; if not found exit 1. + * clipboard.WriteAll(f.KeyValue). + * Print "Copied key for finding #<id> (<provider>, <masked key>) to clipboard." + + - `keys delete <id>`: + * Args: ExactArgs(1). + * GetFinding first to show a masked preview. + * If !flagKeysYes: prompt `"Delete finding #%d (%s, %s)? [y/N]: "` reading from stdin (bufio.NewReader). Accept "y"/"Y"/"yes". + * db.DeleteFinding(id). Print "Deleted finding #<id>." or "No finding with id <id>.". + + - `keys verify <id>`: + * Args: ExactArgs(1). + * GetFinding; load providers.NewRegistry(); build one engine.Finding from the stored row. + * Use verify.EnsureConsent(db, os.Stdin, os.Stderr); if not granted, exit 2. + * verifier := verify.NewHTTPVerifier(10*time.Second); results := verifier.VerifyAll(ctx, []engine.Finding{f}, reg, 1). + * Read the single result and apply it to the stored record (re-SaveFinding with updated verify fields? — simpler: use a new helper `db.UpdateFindingVerify(id, status, httpCode, metadata, errMsg)`; if that helper doesn't exist, do it inline via `db.SQL().Exec("UPDATE findings SET verified=?, verify_status=?, verify_http_code=?, verify_metadata_json=? WHERE id=?", ...)` with JSON-marshaled metadata). + * Print the updated finding using the "show" rendering. + + 4. Helper: `storageToEngine(f storage.Finding) engine.Finding` — maps fields. Put it in cmd/keys.go as unexported. + + 5.
Helper: `openDBWithKey() (*storage.DB, []byte, error)` that mirrors scan.go's DB-open sequence (config load, mkdir, storage.Open, loadOrCreateEncKey). Extract this so all keys subcommands share one path. + + 6. In cmd/root.go the existing `rootCmd.AddCommand(keysCmd)` line already registers the stub's keysCmd. Since cmd/keys.go now declares `var keysCmd`, ensure the old declaration in cmd/stubs.go is removed (Task 1 step 1) so there is exactly one declaration. Run `go build ./cmd/...` to confirm. + + 7. Register subcommands in an init() in cmd/keys.go: + ```go + func init() { + // list + keysListCmd.Flags().BoolVar(&flagKeysUnmask, "unmask", false, "show full key values") + keysListCmd.Flags().StringVar(&flagKeysProvider, "provider", "", "filter by provider name") + keysListCmd.Flags().BoolVar(&flagKeysVerified, "verified", false, "filter: verified only (use --verified=false for unverified only)") + keysListCmd.Flags().IntVar(&flagKeysLimit, "limit", 0, "max rows (0 = unlimited)") + // export + keysExportCmd.Flags().StringVar(&flagKeysFormat, "format", "json", "export format: json, csv") + keysExportCmd.Flags().StringVar(&flagKeysOutFile, "output", "", "write to file instead of stdout") + // delete + keysDeleteCmd.Flags().BoolVar(&flagKeysYes, "yes", false, "skip confirmation") + // wiring + keysCmd.AddCommand(keysListCmd, keysShowCmd, keysExportCmd, keysCopyCmd, keysDeleteCmd, keysVerifyCmd) + _ = viper.BindPFlag("keys.unmask", keysListCmd.Flags().Lookup("unmask")) + } + ``` + + + cd /home/salva/Documents/apikey && go build ./... && go vet ./cmd/... 
+ + + - `go build ./...` succeeds + - `cmd/stubs.go` no longer declares keysCmd + - `cmd/keys.go` declares keysCmd + 6 subcommands + - `grep -q "keysListCmd\|keysShowCmd\|keysExportCmd\|keysCopyCmd\|keysDeleteCmd\|keysVerifyCmd" cmd/keys.go` + - `keyhunter keys --help` (via `go run ./ keys --help`) lists all 6 subcommands + + + + + Task 2: Integration tests for keys list/show/export/delete against in-memory DB + cmd/keys_test.go + + - cmd/keys.go (from Task 1) + - pkg/storage/queries.go + + + - Tests use a temp file SQLite DB (not :memory: because cobra commands open by path). + - Each test sets viper.Set("database.path", tmpPath) and config passphrase via env, seeds findings, then invokes the cobra subcommand via rootCmd.SetArgs() + Execute() OR directly invokes the RunE function with captured stdout. + - Prefer direct RunE invocation with cmd.SetOut/SetErr buffers to isolate from global os.Stdout. + - Seed: 3 findings (2 openai, 1 anthropic; one verified). + - Tests: + * TestKeysList_Default: output contains both providers and all 3 ids, key column masked. + * TestKeysList_FilterProvider: --provider=openai shows only 2 rows. + * TestKeysShow_Hit: `keys show ` output contains the full plaintext KeyValue, not masked. + * TestKeysShow_Miss: `keys show 9999` returns a non-nil error. + * TestKeysExport_JSON: --format=json to stdout parses as JSON array of length 3, unmasked keys present. + * TestKeysExport_CSVFile: --format=csv --output=; file exists, header row matches, 3 data rows. + * TestKeysDelete_WithYes: --yes deletes finding; subsequent list returns 2. + - Skip TestKeysCopy (clipboard not available in test env) and TestKeysVerify (requires network). Document skip with a comment. + + + Create cmd/keys_test.go using testify. Use t.TempDir() for DB path. Seed findings with a helper similar to storage/queries_test. For each test, reset viper state and flag vars between runs. 
+ + Example scaffold: + ```go + func seedDB(t *testing.T) (string, func()) { + t.Helper() + dir := t.TempDir() + dbPath := filepath.Join(dir, "k.db") + viper.Set("database.path", dbPath) + t.Setenv("KEYHUNTER_PASSPHRASE", "test-pass") + db, err := storage.Open(dbPath) + require.NoError(t, err) + encKey, err := loadOrCreateEncKey(db, "test-pass") + require.NoError(t, err) + seed := []storage.Finding{ + {ProviderName: "openai", KeyValue: "sk-aaaaaaaaaaaaaaaaaaaa", KeyMasked: "sk-aaaa...aaaa", Confidence: "high", SourcePath: "a.go", LineNumber: 10}, + {ProviderName: "openai", KeyValue: "sk-bbbbbbbbbbbbbbbbbbbb", KeyMasked: "sk-bbbb...bbbb", Confidence: "medium", SourcePath: "b.go", LineNumber: 20, Verified: true, VerifyStatus: "live"}, + {ProviderName: "anthropic", KeyValue: "sk-ant-cccccccccccccccc", KeyMasked: "sk-ant-c...cccc", Confidence: "high", SourcePath: "c.go", LineNumber: 30}, + } + for _, f := range seed { + _, err := db.SaveFinding(f, encKey) + require.NoError(t, err) + } + require.NoError(t, db.Close()) + return dbPath, func() { viper.Reset() } + } + ``` + + Capture output by using `cmd.SetOut(buf); cmd.SetErr(buf)` then `cmd.Execute()` on a copy of keysCmd, OR directly call `keysListCmd.RunE(keysListCmd, []string{})` after redirecting `os.Stdout` to a pipe (prefer SetOut if subcommands write via `cmd.OutOrStdout()`; update keys.go to use that helper in Task 1 so tests are clean). + + NOTE: If Task 1 wrote fmt.Fprintln(os.Stdout, ...), adjust it to use `cmd.OutOrStdout()` to make these tests hermetic. This is a cheap refactor — do it during Task 2 if missed in Task 1. + + + cd /home/salva/Documents/apikey && go test ./cmd/... -run "TestKeysList|TestKeysShow|TestKeysExport|TestKeysDelete" -count=1 + + + - All listed keys tests pass + - `go test ./cmd/... -count=1` has no regressions + - Test file uses cmd.OutOrStdout() pattern (cmd/keys.go updated if needed) + + + + + + +- `go build ./...` succeeds +- `go test ./cmd/... ./pkg/storage/... ./pkg/output/... 
-count=1` all green +- Manual smoke: `go run . keys --help` lists list/show/export/copy/delete/verify + + + +- All of KEYS-01..06 are implemented +- keys export reuses the formatter registry (JSON/CSV) +- keys copy uses atotto/clipboard +- keys delete requires confirmation unless --yes +- Integration tests cover list/show/export/delete + + + +After completion, create `.planning/phases/06-output-reporting/06-05-SUMMARY.md`. + diff --git a/.planning/phases/06-output-reporting/06-06-PLAN.md b/.planning/phases/06-output-reporting/06-06-PLAN.md new file mode 100644 index 0000000..558041d --- /dev/null +++ b/.planning/phases/06-output-reporting/06-06-PLAN.md @@ -0,0 +1,242 @@ +--- +phase: 06-output-reporting +plan: 06 +type: execute +wave: 2 +depends_on: [06-01, 06-02, 06-03] +files_modified: + - cmd/scan.go + - cmd/scan_output_test.go +autonomous: true +requirements: [OUT-05, OUT-06] + +must_haves: + truths: + - "scan --output accepts table, json, sarif, csv and dispatches to the registered formatter" + - "invalid --output values fail with a clear error listing valid formats" + - "scan exit code is 0 on clean scans, 1 on findings, 2 on scan errors" + - "key masking is the default; --unmask propagates through opts.Unmask to all formatters" + artifacts: + - path: cmd/scan.go + provides: "Refactored output dispatch via output.Get + exit-code handling" + contains: "output.Get(flagOutput" + key_links: + - from: cmd/scan.go + to: pkg/output/formatter.go + via: "output.Get(flagOutput).Format(findings, os.Stdout, Options{Unmask, ToolName, ToolVersion})" + pattern: "output\\.Get\\(flagOutput" +--- + + +Wire the scan command to the formatter registry established in Plans 01-03. Replace the inline json/table switch with `output.Get(name)` and finalize exit-code semantics (0/1/2). + +Purpose: Surfaces all four output formats through `scan --output=` and enforces OUT-06 exit-code contract for CI/CD consumers. 
+Output: Updated `cmd/scan.go`, a small test file covering format dispatch and exit codes. + + + +@$HOME/.claude/get-shit-done/workflows/execute-plan.md +@$HOME/.claude/get-shit-done/templates/summary.md + + + +@.planning/phases/06-output-reporting/06-CONTEXT.md +@.planning/phases/06-output-reporting/06-01-PLAN.md +@cmd/scan.go +@cmd/root.go + + +From Plans 06-01..03: +```go +output.Get(name string) (Formatter, error) // returns ErrUnknownFormat wrapped +output.Names() []string +output.Options{ Unmask, ToolName, ToolVersion } +``` + +Current scan.go ends with: +```go +switch flagOutput { +case "json": + // inline jsonFinding encoder +default: + output.PrintFindings(findings, flagUnmask) +} +if len(findings) > 0 { os.Exit(1) } +return nil +``` + + + + + + + Task 1: Replace scan output switch with formatter registry and finalize exit codes + cmd/scan.go + + - cmd/scan.go (current dispatch) + - pkg/output/formatter.go (Get, Options, Names) + - cmd/root.go (for version constant — if none exists, use "dev") + + + 1. Remove the inline `jsonFinding` struct and the `switch flagOutput` block. + + 2. Replace with: + + ```go + // Output via the formatter registry (OUT-01..04). + formatter, err := output.Get(flagOutput) + if err != nil { + return fmt.Errorf("%w (valid: %s)", err, strings.Join(output.Names(), ", ")) + } + if err := formatter.Format(findings, os.Stdout, output.Options{ + Unmask: flagUnmask, + ToolName: "keyhunter", + ToolVersion: versionString(), // see step 4 + }); err != nil { + return fmt.Errorf("rendering %s output: %w", flagOutput, err) + } + + // OUT-06 exit codes: 0=clean, 1=findings, 2=error (errors returned via RunE -> root.Execute). + if len(findings) > 0 { + os.Exit(1) + } + return nil + ``` + + Add `"strings"` import if missing. + + 3. Update the --output flag help text: + ```go + scanCmd.Flags().StringVar(&flagOutput, "output", "table", "output format: table, json, sarif, csv") + ``` + + 4. 
Version helper: if cmd/root.go doesn't already export a version constant, add (in cmd/scan.go or a new cmd/version.go): + ```go + // versionString returns the compiled tool version. Set via -ldflags "-X github.com/salvacybersec/keyhunter/cmd.version=...". + var version = "dev" + func versionString() string { return version } + ``` + If a version constant already exists elsewhere, reuse it instead of adding a new one. + + 5. Confirm the rootCmd.Execute() path in cmd/root.go already handles errors by `os.Exit(1)` on non-nil. For the OUT-06 "exit 2 on error" requirement, update cmd/root.go Execute(): + ```go + func Execute() { + if err := rootCmd.Execute(); err != nil { + // cobra already prints the error message. Exit 2 signals scan/tool error + // per OUT-06. (Exit 1 is reserved for "findings present".) + os.Exit(2) + } + } + ``` + This is a one-line change from `os.Exit(1)` to `os.Exit(2)`. Any subcommand returning an error will now exit 2, which matches the CI contract (findings=1, error=2, clean=0). + + 6. Verify the old inline jsonFinding struct is gone, and clean up its imports — check whether "encoding/json" is still used anywhere in scan.go; if not, remove the import. + + + cd /home/salva/Documents/apikey && go build ./... && go vet ./cmd/... + + + - `go build ./...` succeeds + - `grep -q "output\\.Get(flagOutput)" cmd/scan.go` + - `grep -q "output\\.Options{" cmd/scan.go` + - `! grep -q "jsonFinding" cmd/scan.go` (inline struct removed; note `grep -v` would not test absence) + - `grep -q "os\\.Exit(2)" cmd/root.go` + - Help text lists all four formats + + + + + Task 2: Tests for unknown format error and exit-code contract + cmd/scan_output_test.go + + - cmd/scan.go (updated dispatch) + - pkg/output/formatter.go + + + - Test 1: TestScanOutput_UnknownFormat — setting flagOutput="bogus" and invoking a minimal code path that calls output.Get(flagOutput) returns an error whose message contains "unknown format" and lists valid names.
+ - Test 2: TestScanOutput_FormatNamesIncludeAll — output.Names() returns a slice containing "table", "json", "csv", "sarif". + - Rather than invoking the whole RunE (which requires a real scan target), isolate the dispatch logic into a small helper `renderScanOutput(findings, name string, unmask bool, w io.Writer) error` inside cmd/scan.go. Update Task 1 if not already done so the helper exists. Test exercises the helper directly. + - Test 3: TestRenderScanOutput_JSONSucceeds — passes an empty findings slice with name="json"; asserts output is valid JSON array `[]`. + - Test 4: TestRenderScanOutput_UnknownReturnsError — name="bogus"; asserts errors.Is(err, output.ErrUnknownFormat). + + + 1. If not done in Task 1, add `renderScanOutput(findings []engine.Finding, name string, unmask bool, w io.Writer) error` to cmd/scan.go and use it from the RunE. + + 2. Create cmd/scan_output_test.go: + + ```go + package cmd + + import ( + "bytes" + "encoding/json" + "errors" + "strings" + "testing" + + "github.com/salvacybersec/keyhunter/pkg/engine" + "github.com/salvacybersec/keyhunter/pkg/output" + "github.com/stretchr/testify/assert" + "github.com/stretchr/testify/require" + ) + + func TestScanOutput_FormatNamesIncludeAll(t *testing.T) { + names := output.Names() + for _, want := range []string{"table", "json", "csv", "sarif"} { + assert.Contains(t, names, want) + } + } + + func TestRenderScanOutput_UnknownReturnsError(t *testing.T) { + err := renderScanOutput(nil, "bogus", false, &bytes.Buffer{}) + require.Error(t, err) + assert.True(t, errors.Is(err, output.ErrUnknownFormat)) + assert.Contains(t, err.Error(), "valid:") + } + + func TestRenderScanOutput_JSONSucceeds(t *testing.T) { + var buf bytes.Buffer + err := renderScanOutput([]engine.Finding{}, "json", false, &buf) + require.NoError(t, err) + var out []any + require.NoError(t, json.Unmarshal(buf.Bytes(), &out)) + assert.Len(t, out, 0) + } + + func TestRenderScanOutput_TableEmpty(t *testing.T) { + var buf bytes.Buffer + 
err := renderScanOutput(nil, "table", false, &buf) + require.NoError(t, err) + assert.True(t, strings.Contains(buf.String(), "No API keys found")) + } + ``` + + + cd /home/salva/Documents/apikey && go test ./cmd/... -run "TestScanOutput|TestRenderScanOutput" -count=1 + + + - All four tests pass + - `grep -q "func renderScanOutput" cmd/scan.go` + - errors.Is(err, output.ErrUnknownFormat) works from cmd package + + + + + + +- `go build ./...` succeeds +- `go test ./... -count=1` all green +- Manual smoke: `go run . scan --output=bogus /tmp` prints "unknown format" and exits 2 +- Manual smoke: `go run . scan --output=sarif testdata/` prints SARIF JSON + + + +- scan --output dispatches to the formatter registry for all four formats +- Unknown format error lists valid names +- Exit codes: clean=0, findings=1, error=2 +- OUT-05 masking default respected via flagUnmask -> Options.Unmask + + + +After completion, create `.planning/phases/06-output-reporting/06-06-SUMMARY.md`. +