| phase | plan | type | wave | depends_on | files_modified | autonomous | requirements | must_haves |
|---|---|---|---|---|---|---|---|---|
| 03-tier-3-9-providers | 06 | execute | 1 | | | true | | |
Purpose: Satisfy PROV-09 (8 Tier 9 Enterprise providers). These target regulated enterprise estates where leaked keys carry high blast radius.
Output: 16 YAML files.
Addresses PROV-09.
Note: oracle-genai here is distinct from oracle-ai in plan 03-04 (which covers Tier 7 code tools). The Tier 9 entry uses the name oracle-genai to avoid a collision.
<execution_context> @$HOME/.claude/get-shit-done/workflows/execute-plan.md @$HOME/.claude/get-shit-done/templates/summary.md </execution_context>
@.planning/ROADMAP.md @.planning/phases/03-tier-3-9-providers/03-CONTEXT.md @pkg/providers/schema.go

Dual-location required. Keyword-only detection for most; databricks has a documented `dapi` prefix.

Task 1: Salesforce, ServiceNow, SAP, Palantir YAMLs

Files:
- providers/salesforce-einstein.yaml
- providers/servicenow.yaml
- providers/sap-ai-core.yaml
- providers/palantir.yaml
- pkg/providers/definitions/salesforce-einstein.yaml
- pkg/providers/definitions/servicenow.yaml
- pkg/providers/definitions/sap-ai-core.yaml
- pkg/providers/definitions/palantir.yaml

Reference: pkg/providers/schema.go

All 4 use keyword-only detection (no public key formats documented).

providers/salesforce-einstein.yaml:
format_version: 1
name: salesforce-einstein
display_name: Salesforce Einstein GPT
tier: 9
last_verified: "2026-04-05"
keywords:
- "einstein-gpt"
- "einsteinGPT"
- "SALESFORCE_CONSUMER_KEY"
- "SALESFORCE_CONSUMER_SECRET"
- "api.salesforce.com"
- "einstein.ai"
- "salesforce-einstein"
verify:
  method: GET
  url: ""
  headers: {}
  valid_status: []
  invalid_status: []
providers/servicenow.yaml:
format_version: 1
name: servicenow
display_name: ServiceNow Now Assist
tier: 9
last_verified: "2026-04-05"
keywords:
- "servicenow"
- "now-assist"
- "SERVICENOW_INSTANCE"
- "SERVICENOW_USERNAME"
- "SERVICENOW_PASSWORD"
- "service-now.com"
verify:
  method: GET
  url: ""
  headers: {}
  valid_status: []
  invalid_status: []
providers/sap-ai-core.yaml:
format_version: 1
name: sap-ai-core
display_name: SAP AI Core / Joule
tier: 9
last_verified: "2026-04-05"
keywords:
- "sap-ai-core"
- "sap-joule"
- "SAP_AICORE_CLIENT_ID"
- "SAP_AICORE_CLIENT_SECRET"
- "SAP_AICORE_AUTH_URL"
- "hana.ondemand.com"
- "aicore"
verify:
  method: GET
  url: ""
  headers: {}
  valid_status: []
  invalid_status: []
providers/palantir.yaml:
format_version: 1
name: palantir
display_name: Palantir AIP
tier: 9
last_verified: "2026-04-05"
keywords:
- "palantir"
- "foundry"
- "PALANTIR_TOKEN"
- "FOUNDRY_TOKEN"
- "palantirfoundry.com"
- "aip-agents"
verify:
  method: GET
  url: ""
  headers: {}
  valid_status: []
  invalid_status: []
Copy all 4 files verbatim to pkg/providers/definitions/.
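The copy step above can be sketched as a small loop. The scratch `mkdir`/`touch` lines make the snippet self-contained; in the real repo the loop runs from the repo root against the YAML files already written.

```shell
# Stand-in setup so the sketch runs anywhere (omit in the real repo).
mkdir -p providers pkg/providers/definitions
for f in salesforce-einstein servicenow sap-ai-core palantir; do
  touch "providers/$f.yaml"
done

# The actual copy step: dual-locate each Task 1 definition verbatim.
for f in salesforce-einstein servicenow sap-ai-core palantir; do
  cp "providers/$f.yaml" "pkg/providers/definitions/$f.yaml"
done
```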
cd /home/salva/Documents/apikey && for f in salesforce-einstein servicenow sap-ai-core palantir; do diff providers/$f.yaml pkg/providers/definitions/$f.yaml || exit 1; done && go test ./pkg/providers/... -count=1 && go test ./pkg/engine/... -count=1
<acceptance_criteria>
- All 8 files exist
- grep -q 'einsteinGPT' providers/salesforce-einstein.yaml
- grep -q 'foundry' providers/palantir.yaml
- grep -q 'SAP_AICORE_CLIENT_ID' providers/sap-ai-core.yaml
- go test ./pkg/providers/... -count=1 passes
</acceptance_criteria>
4 enterprise platform providers dual-located.

Task 2: Databricks, Snowflake, Oracle GenAI, HPE GreenLake YAMLs

providers/databricks.yaml:
format_version: 1
name: databricks
display_name: Databricks (DBRX / Mosaic)
tier: 9
last_verified: "2026-04-05"
keywords:
- "databricks"
- "DATABRICKS_TOKEN"
- "DATABRICKS_HOST"
- "dbrx"
- "mosaicml"
- "dapi"
- ".cloud.databricks.com"
patterns:
  - regex: 'dapi[a-f0-9]{32}(-[0-9]{1,2})?'
    entropy_min: 3.5
    confidence: high
verify:
  method: GET
  url: ""
  headers: {}
  valid_status: []
  invalid_status: []
providers/snowflake.yaml:
format_version: 1
name: snowflake
display_name: Snowflake Cortex
tier: 9
last_verified: "2026-04-05"
keywords:
- "snowflake"
- "SNOWFLAKE_ACCOUNT"
- "SNOWFLAKE_USER"
- "SNOWFLAKE_PASSWORD"
- "SNOWFLAKE_PRIVATE_KEY"
- "snowflakecomputing.com"
- "cortex"
verify:
  method: GET
  url: ""
  headers: {}
  valid_status: []
  invalid_status: []
providers/oracle-genai.yaml:
format_version: 1
name: oracle-genai
display_name: Oracle Cloud Generative AI Service
tier: 9
last_verified: "2026-04-05"
keywords:
- "oci-generative-ai"
- "OCI_GENAI_COMPARTMENT"
- "oracle-cloud-genai"
- "inference.generativeai.us-chicago-1"
- "oci-cli"
- "OCI_CONFIG_FILE"
verify:
  method: GET
  url: ""
  headers: {}
  valid_status: []
  invalid_status: []
providers/hpe-greenlake.yaml:
format_version: 1
name: hpe-greenlake
display_name: HPE GreenLake for LLMs
tier: 9
last_verified: "2026-04-05"
keywords:
- "hpe-greenlake"
- "greenlake"
- "HPE_CLIENT_ID"
- "HPE_CLIENT_SECRET"
- "common.cloud.hpe.com"
- "hpe-ai"
verify:
  method: GET
  url: ""
  headers: {}
  valid_status: []
  invalid_status: []
Copy all 4 files verbatim to pkg/providers/definitions/.
cd /home/salva/Documents/apikey && for f in databricks snowflake oracle-genai hpe-greenlake; do diff providers/$f.yaml pkg/providers/definitions/$f.yaml || exit 1; done && go test ./pkg/providers/... -count=1 && go test ./pkg/engine/... -count=1 && test $(grep -l 'tier: 9' providers/*.yaml | wc -l) -eq 8
<acceptance_criteria>
- All 8 files exist
- grep -q 'dapi' providers/databricks.yaml
- grep -q 'snowflakecomputing.com' providers/snowflake.yaml
- grep -q 'greenlake' providers/hpe-greenlake.yaml
- Total Tier 9 count = 8
- go test ./pkg/providers/... -count=1 passes
</acceptance_criteria>
All 8 Tier 9 enterprise providers dual-located. PROV-09 satisfied.
<success_criteria>
- 8 Tier 9 enterprise providers created
- Databricks uses the documented `dapi` high-confidence pattern
- Strong env var keyword anchors on all providers
- No engine regression
</success_criteria>