feat(03-05): add Ollama, vLLM, LocalAI, LM Studio, llama.cpp provider YAMLs
- 5 Tier 8 self-hosted runtime provider definitions (keyword-only)
- Localhost endpoints and env var anchors for OSINT correlation
- Dual-located in providers/ and pkg/providers/definitions/
providers/ollama.yaml (new file, +19)
@@ -0,0 +1,19 @@
+format_version: 1
+name: ollama
+display_name: Ollama
+tier: 8
+last_verified: "2026-04-05"
+keywords:
+  - "ollama"
+  - "OLLAMA_HOST"
+  - "OLLAMA_API_KEY"
+  - "OLLAMA_MODELS"
+  - "localhost:11434"
+  - "127.0.0.1:11434"
+  - "api/generate"
+verify:
+  method: GET
+  url: ""
+  headers: {}
+  valid_status: []
+  invalid_status: []
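As a rough illustration of how a keyword-only provider definition like this could be consumed, here is a minimal Python sketch of substring matching against the Ollama keyword list. The function name and matching semantics are assumptions for illustration; the actual matcher in pkg/providers is not shown in this commit.

```python
# Keyword anchors copied from providers/ollama.yaml in this commit.
OLLAMA_KEYWORDS = [
    "ollama",
    "OLLAMA_HOST",
    "OLLAMA_API_KEY",
    "OLLAMA_MODELS",
    "localhost:11434",
    "127.0.0.1:11434",
    "api/generate",
]


def match_provider(text: str, keywords: list[str] = OLLAMA_KEYWORDS) -> list[str]:
    """Hypothetical matcher: return the keywords found in `text`.

    Case-sensitive substring matching, preserving the order of the
    keyword list. A real implementation may differ (tokenization,
    case folding, scoring), so treat this as a sketch only.
    """
    return [kw for kw in keywords if kw in text]


# Example: an env var assignment that should correlate to Ollama.
hits = match_provider("export OLLAMA_HOST=127.0.0.1:11434")
```

Because the `verify` block here is intentionally empty (no URL, no expected status codes), detection for this Tier 8 provider would rest entirely on keyword hits like the ones above, rather than on an active endpoint probe.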