feat(03-05): add Ollama, vLLM, LocalAI, LM Studio, llama.cpp provider YAMLs

- 5 Tier 8 self-hosted runtime provider definitions (keyword-only)
- Localhost endpoints and env var anchors for OSINT correlation
- Dual-located in providers/ and pkg/providers/definitions/
Author: salvacybersec
Date: 2026-04-05 14:41:35 +03:00
Parent: a318b9d89f
Commit: 370dca0cbb
10 changed files with 178 additions and 0 deletions

providers/vllm.yaml (new file)

@@ -0,0 +1,18 @@
format_version: 1
name: vllm
display_name: vLLM
tier: 8
last_verified: "2026-04-05"
keywords:
- "vllm"
- "VLLM_API_KEY"
- "vllm-openai"
- "--api-key"
- "openai.api_server"
- "vllm.entrypoints"
verify:
method: GET
url: ""
headers: {}
valid_status: []
invalid_status: []
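
Since this provider is keyword-only (the `verify` block is intentionally empty — there is no remote endpoint to probe), a consumer of this definition would correlate purely on the keyword anchors. The following is a minimal sketch, not the repo's actual code: the keyword list mirrors `providers/vllm.yaml`, but the function name and matching semantics (case-sensitive substring search) are assumptions for illustration.

```python
# Hypothetical sketch of keyword-only provider matching for OSINT
# correlation. The keyword list below mirrors the `keywords` field in
# providers/vllm.yaml; the matcher itself is an assumed implementation,
# not code from this repository.

VLLM_KEYWORDS = [
    "vllm",
    "VLLM_API_KEY",
    "vllm-openai",
    "--api-key",
    "openai.api_server",
    "vllm.entrypoints",
]


def match_provider(text: str, keywords: list[str]) -> list[str]:
    """Return the keyword anchors found in `text`.

    Case-sensitive substring matching, so env-var anchors like
    VLLM_API_KEY only fire on exact casing.
    """
    return [kw for kw in keywords if kw in text]


# Example: a leaked shell one-liner starting a local vLLM server.
sample = (
    "export VLLM_API_KEY=sk-local; "
    "python -m vllm.entrypoints.openai.api_server --api-key $VLLM_API_KEY"
)
hits = match_provider(sample, VLLM_KEYWORDS)
print(hits)
# → ['vllm', 'VLLM_API_KEY', '--api-key', 'openai.api_server', 'vllm.entrypoints']
```

Note that `"vllm-openai"` (the Docker image name) does not match here, while the module-path anchors do — multiple overlapping anchors make correlation robust to which artifact (env var, CLI flag, image name, import path) happens to leak.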