| phase | plan | type | wave | depends_on | files_modified | autonomous | requirements | must_haves |
|---|---|---|---|---|---|---|---|---|
| 12-osint_iot_cloud_storage | 03 | execute | 1 | | | true | | |
Purpose: Enable discovery of API keys leaked in publicly accessible cloud storage buckets across AWS, GCP, Azure, and DigitalOcean. Output: Four source files + tests following the established Phase 10 pattern.
Note on RECON-CLOUD-03 (MinIO via Shodan) and RECON-CLOUD-04 (GrayHatWarfare): These are addressed here. MinIO discovery is implemented as a Shodan query variant within S3Scanner (MinIO uses S3-compatible API). GrayHatWarfare is implemented as a dedicated scanner that queries the GrayHatWarfare buckets.grayhatwarfare.com API.
<execution_context> @$HOME/.claude/get-shit-done/workflows/execute-plan.md @$HOME/.claude/get-shit-done/templates/summary.md </execution_context>
@.planning/PROJECT.md @.planning/ROADMAP.md @.planning/STATE.md @pkg/recon/source.go @pkg/recon/sources/httpclient.go @pkg/recon/sources/bing.go @pkg/recon/sources/queries.go @pkg/recon/sources/register.go

From pkg/recon/source.go:

```go
type ReconSource interface {
	Name() string
	RateLimit() rate.Limit
	Burst() int
	RespectsRobots() bool
	Enabled(cfg Config) bool
	Sweep(ctx context.Context, query string, out chan<- Finding) error
}
```

From pkg/recon/sources/httpclient.go:

```go
type Client struct {
	HTTP       *http.Client
	MaxRetries int
	UserAgent  string
}

func NewClient() *Client
func (c *Client) Do(ctx context.Context, req *http.Request) (*http.Response, error)
```
GCSScanner (gcsscanner.go) — RECON-CLOUD-02:
- Struct: `GCSScanner` with fields `Registry *providers.Registry`, `Limiters *recon.LimiterRegistry`, `BaseURL string`, `client *Client`
- Name(): "gcs"
- RateLimit(): rate.Every(500 * time.Millisecond)
- Burst(): 3
- RespectsRobots(): false
- Enabled(): always true (credentialless)
- Sweep(): Same bucket enumeration pattern as S3Scanner, but using `https://storage.googleapis.com/{bucket}` for HEAD and listing. GCS public bucket listing returns JSON when `Accept: application/json` is set. Parse `{"items":[{"name":"..."}]}`. Emit findings for config-pattern files with Source=`gs://{bucket}/{name}`, SourceType="recon:gcs".
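A minimal sketch of the JSON parsing step described above. The struct and function names here (`gcsListing`, `parseGCSListing`) are hypothetical stand-ins for whatever the real gcsscanner.go defines; only the `{"items":[{"name":"..."}]}` shape comes from the plan.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// gcsListing mirrors the subset of the GCS public-listing JSON the plan
// cares about: {"items":[{"name":"..."}]}.
type gcsListing struct {
	Items []struct {
		Name string `json:"name"`
	} `json:"items"`
}

// parseGCSListing decodes a listing body and returns the object names,
// which Sweep would then filter for config-pattern files.
func parseGCSListing(body []byte) ([]string, error) {
	var l gcsListing
	if err := json.Unmarshal(body, &l); err != nil {
		return nil, err
	}
	names := make([]string, 0, len(l.Items))
	for _, it := range l.Items {
		names = append(names, it.Name)
	}
	return names, nil
}

func main() {
	names, _ := parseGCSListing([]byte(`{"items":[{"name":"config/.env"},{"name":"keys.json"}]}`))
	fmt.Println(names) // [config/.env keys.json]
}
```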
Both sources share a common bucketNames helper function — define it in s3scanner.go and export it for use by both.
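One way the shared helper could look, assuming it expands provider keywords with a fixed suffix list; the suffixes and the exact expansion scheme are assumptions, not specified by the plan beyond the "testprov-keys" example in the test section.

```go
package main

import "fmt"

// bucketSuffixes is a hypothetical suffix list; the real one belongs in
// s3scanner.go.
var bucketSuffixes = []string{"keys", "secrets", "backup", "config"}

// BucketNames expands each provider keyword into candidate bucket names,
// e.g. "testprov" -> "testprov", "testprov-keys", "testprov-secrets", ...
// Exported so both scanners can share it, per the plan.
func BucketNames(keywords []string) []string {
	var out []string
	for _, kw := range keywords {
		out = append(out, kw)
		for _, s := range bucketSuffixes {
			out = append(out, kw+"-"+s)
		}
	}
	return out
}

func main() {
	fmt.Println(BucketNames([]string{"testprov"}))
	// [testprov testprov-keys testprov-secrets testprov-backup testprov-config]
}
```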
cd /home/salva/Documents/apikey/.claude/worktrees/agent-a6700ee2 && go build ./pkg/recon/sources/
S3Scanner and GCSScanner compile and implement recon.ReconSource
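The "implement recon.ReconSource" check is typically enforced with a compile-time assertion (also called for in the success criteria). A sketch of the idiom, using a local stand-in interface and type rather than the real package:

```go
package main

// ReconSource stands in for the real recon.ReconSource; only one method
// is shown here for brevity.
type ReconSource interface{ Name() string }

// S3Scanner stands in for the real scanner type.
type S3Scanner struct{}

func (s *S3Scanner) Name() string { return "s3" }

// Compile-time assertion: this line fails to build if *S3Scanner ever
// stops satisfying ReconSource. Each source file gets one of these.
var _ ReconSource = (*S3Scanner)(nil)

func main() {}
```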
DOSpacesScanner (dospaces.go) — RECON-CLOUD-02:
- Struct: `DOSpacesScanner` with fields `Registry *providers.Registry`, `Limiters *recon.LimiterRegistry`, `BaseURL string`, `client *Client`
- Name(): "spaces"
- RateLimit(): rate.Every(500 * time.Millisecond)
- Burst(): 3
- RespectsRobots(): false
- Enabled(): always true (credentialless)
- Sweep(): Uses bucket enumeration with the DO Spaces URL format `https://{bucket}.{region}.digitaloceanspaces.com/`. Iterate regions: nyc3, sfo3, ams3, sgp1, fra1. Same XML ListBucket format as S3 (DO Spaces is S3-compatible). Emit findings with Source=`do://{bucket}/{key}`, SourceType="recon:spaces".
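A sketch of decoding the S3-style ListBucket XML that DO Spaces returns. The struct names (`listBucketResult`, `parseListing`) are hypothetical; only the `ListBucketResult`/`Contents`/`Key` element names come from the S3-compatible format the plan references.

```go
package main

import (
	"encoding/xml"
	"fmt"
)

// listBucketResult mirrors the subset of the S3 ListBucket XML that the
// scanners need: the Key of each object in the bucket.
type listBucketResult struct {
	XMLName  xml.Name `xml:"ListBucketResult"`
	Contents []struct {
		Key string `xml:"Key"`
	} `xml:"Contents"`
}

// parseListing returns the object keys in a ListBucket response body,
// which Sweep would turn into do://{bucket}/{key} findings.
func parseListing(body []byte) ([]string, error) {
	var r listBucketResult
	if err := xml.Unmarshal(body, &r); err != nil {
		return nil, err
	}
	keys := make([]string, 0, len(r.Contents))
	for _, c := range r.Contents {
		keys = append(keys, c.Key)
	}
	return keys, nil
}

func main() {
	body := []byte(`<ListBucketResult><Contents><Key>.env</Key></Contents><Contents><Key>id_rsa</Key></Contents></ListBucketResult>`)
	keys, _ := parseListing(body)
	fmt.Println(keys) // [.env id_rsa]
}
```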
Tests (all four test files): Each test file follows the httptest pattern:
- Mock server returns appropriate XML/JSON for bucket listing
- Verify Sweep emits correct number of findings with correct SourceType and Source URL format
- Verify Enabled() returns true (credentialless sources)
- Test with empty registry (no keywords => no bucket names => no findings)
- Test context cancellation
Use a minimal providers.Registry with one test provider having keyword "testprov" so bucket names like "testprov-keys" are generated.

cd /home/salva/Documents/apikey/.claude/worktrees/agent-a6700ee2 && go test ./pkg/recon/sources/ -run "TestS3Scanner|TestGCSScanner|TestAzureBlob|TestDOSpaces" -v -count=1

All four cloud scanner sources compile and pass tests; each emits findings with the correct source type and URL format.
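The mock-server half of the httptest pattern above can be sketched as follows. The canned listing, helper names, and the `/testprov-keys` path are all hypothetical; in the real tests the scanner's `BaseURL` field would be pointed at `srv.URL` and the assertions would run against the findings emitted by Sweep.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
)

// newMockBucketServer serves a canned GCS-style JSON listing, the shape
// the plan's GCS tests would mock.
func newMockBucketServer() *httptest.Server {
	return httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		io.WriteString(w, `{"items":[{"name":"testprov-keys/config.json"}]}`)
	}))
}

// fetchListing GETs a bucket listing from the given base URL, standing in
// for the scanner's HEAD-then-list request flow.
func fetchListing(baseURL string) (string, error) {
	resp, err := http.Get(baseURL + "/testprov-keys")
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	b, err := io.ReadAll(resp.Body)
	return string(b), err
}

func main() {
	srv := newMockBucketServer()
	defer srv.Close()
	body, err := fetchListing(srv.URL)
	if err != nil {
		panic(err)
	}
	fmt.Println(body)
}
```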
- `go build ./pkg/recon/sources/` compiles without errors
- `go test ./pkg/recon/sources/ -run "TestS3Scanner|TestGCSScanner|TestAzureBlob|TestDOSpaces" -v` all pass
- Each source file has a compile-time assertion

<success_criteria> Four cloud storage scanners (S3, GCS, Azure Blob, DO Spaces) implement recon.ReconSource with credentialless public bucket enumeration, use the shared Client for HTTP, and pass unit tests. </success_criteria>
After completion, create `.planning/phases/12-osint_iot_cloud_storage/12-03-SUMMARY.md`