feat(08-04): add 15 Censys + 10 ZoomEye dorks
- 15 Censys Search 2.0 queries for Ollama, vLLM, LocalAI, Open WebUI, LM Studio, Triton, TGI, LiteLLM, Portkey, LangServe, FastChat, text-generation-webui, Azure OpenAI certs, Bedrock certs, and OpenAI proxies (12 infrastructure + 3 frontier)
- 10 ZoomEye app/title/port/service queries covering the same LLM infrastructure surface (9 infrastructure + 1 frontier)
- Dual-located under pkg/dorks/definitions/ (embedded) and dorks/ (repo root)
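Every entry in the two files below shares one flat schema: id, name, source, category, query, description, tags. A minimal sketch of filtering such definitions by source, category, or tag — a Python mirror for illustration only; the field names and sample records come from the files below, while the `select` helper is hypothetical and not the repo's embedded Go code:

```python
# Three records copied from the YAML below, rendered as Python dicts.
DORKS = [
    {"id": "censys-ollama-11434", "source": "censys",
     "category": "infrastructure",
     "query": 'services.port: 11434 and services.http.response.body: "Ollama"',
     "tags": ["ollama", "censys", "infrastructure", "tier1"]},
    {"id": "censys-openai-azure-cert", "source": "censys",
     "category": "frontier",
     "query": 'services.tls.certificates.leaf_data.subject.common_name: "openai.azure.com"',
     "tags": ["openai", "azure", "censys", "frontier", "tls"]},
    {"id": "zoomeye-ollama", "source": "zoomeye",
     "category": "infrastructure",
     "query": 'port:11434 +app:"Ollama"',
     "tags": ["ollama", "zoomeye", "infrastructure", "tier1"]},
]

def select(dorks, source=None, category=None, tag=None):
    """Return the dorks matching every filter that was given."""
    out = []
    for d in dorks:
        if source and d["source"] != source:
            continue
        if category and d["category"] != category:
            continue
        if tag and tag not in d["tags"]:
            continue
        out.append(d)
    return out

# tier1 spans both engines; the frontier filter narrows to cert queries.
tier1 = select(DORKS, tag="tier1")
frontier_censys = select(DORKS, source="censys", category="frontier")
```

The tier1 tag on the two Ollama dorks lets a runner prioritize the highest-signal queries across both engines.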
pkg/dorks/definitions/censys/all.yaml (new file, 119 lines)
@@ -0,0 +1,119 @@
- id: censys-ollama-11434
  name: "Ollama server on port 11434 (Censys)"
  source: censys
  category: infrastructure
  query: 'services.port: 11434 and services.http.response.body: "Ollama"'
  description: "Finds exposed Ollama LLM servers advertising on their default port via Censys."
  tags: [ollama, censys, infrastructure, tier1]

- id: censys-vllm
  name: "vLLM inference server (Censys)"
  source: censys
  category: infrastructure
  query: 'services.http.response.body: "vLLM" and services.http.response.body: "/v1/models"'
  description: "Locates vLLM servers exposing their OpenAI-compatible models endpoint."
  tags: [vllm, censys, infrastructure]

- id: censys-localai
  name: "LocalAI host (Censys)"
  source: censys
  category: infrastructure
  query: 'services.http.response.html_title: "LocalAI"'
  description: "Finds LocalAI self-hosted OpenAI-compatible servers by their HTML title."
  tags: [localai, censys, infrastructure]

- id: censys-openwebui
  name: "Open WebUI dashboard (Censys)"
  source: censys
  category: infrastructure
  query: 'services.http.response.html_title: "Open WebUI"'
  description: "Finds internet-exposed Open WebUI dashboards that front LLM backends."
  tags: [openwebui, censys, infrastructure]

- id: censys-lmstudio
  name: "LM Studio server (Censys)"
  source: censys
  category: infrastructure
  query: 'services.http.response.html_title: "LM Studio"'
  description: "Finds exposed LM Studio local-model servers."
  tags: [lmstudio, censys, infrastructure]

- id: censys-triton
  name: "NVIDIA Triton inference server (Censys)"
  source: censys
  category: infrastructure
  query: 'services.http.response.body: "NVIDIA Triton" and services.http.response.body: "/v2/models"'
  description: "Finds NVIDIA Triton model servers exposing their /v2/models catalog."
  tags: [triton, nvidia, censys, infrastructure]

- id: censys-tgi
  name: "Hugging Face text-generation-inference (Censys)"
  source: censys
  category: infrastructure
  query: 'services.http.response.body: "text-generation-inference"'
  description: "Finds public text-generation-inference (TGI) instances."
  tags: [tgi, huggingface, censys, infrastructure]

- id: censys-litellm
  name: "LiteLLM proxy on :4000 (Censys)"
  source: censys
  category: infrastructure
  query: 'services.http.response.html_title: "LiteLLM" and services.port: 4000'
  description: "Finds LiteLLM proxy servers on their default admin port."
  tags: [litellm, censys, infrastructure]

- id: censys-portkey
  name: "Portkey AI gateway (Censys)"
  source: censys
  category: infrastructure
  query: 'services.http.response.html_title: "Portkey"'
  description: "Finds self-hosted Portkey AI gateway deployments."
  tags: [portkey, censys, infrastructure]

- id: censys-langserve
  name: "LangServe endpoint (Censys)"
  source: censys
  category: infrastructure
  query: 'services.http.response.html_title: "LangServe"'
  description: "Finds LangServe (LangChain) API servers exposed on the public internet."
  tags: [langserve, langchain, censys, infrastructure]

- id: censys-openai-azure-cert
  name: "Azure OpenAI TLS certificates (Censys)"
  source: censys
  category: frontier
  query: 'services.tls.certificates.leaf_data.subject.common_name: "openai.azure.com"'
  description: "Finds hosts presenting TLS certificates with an openai.azure.com subject common name."
  tags: [openai, azure, censys, frontier, tls]

- id: censys-bedrock-cert
  name: "AWS Bedrock runtime certificates (Censys)"
  source: censys
  category: frontier
  query: 'services.tls.certificates.leaf_data.subject.common_name: "bedrock-runtime"'
  description: "Finds hosts presenting TLS certificates whose subject common name references the AWS Bedrock runtime."
  tags: [bedrock, aws, censys, frontier, tls]

- id: censys-fastchat
  name: "FastChat LLM server (Censys)"
  source: censys
  category: infrastructure
  query: 'services.http.response.html_title: "FastChat"'
  description: "Finds FastChat multi-model serving dashboards."
  tags: [fastchat, censys, infrastructure]

- id: censys-textgen-webui
  name: "oobabooga text-generation-webui (Censys)"
  source: censys
  category: infrastructure
  query: 'services.http.response.html_title: "text-generation-webui"'
  description: "Finds exposed oobabooga text-generation-webui instances."
  tags: [oobabooga, textgen, censys, infrastructure]

- id: censys-openai-proxy
  name: "OpenAI-compatible proxy leaking key var (Censys)"
  source: censys
  category: frontier
  query: 'services.http.response.body: "/v1/chat/completions" and services.http.response.body: "OPENAI_API_KEY"'
  description: "Finds OpenAI-compatible proxies whose response bodies leak an OPENAI_API_KEY environment-variable reference."
  tags: [openai, proxy, censys, frontier]
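Every Censys query in this file is a conjunction of field:value clauses, with numeric values left bare and string matches quoted. A small helper that composes queries in that shape — hypothetical and illustrative only, not part of this commit:

```python
def censys_clause(field, value):
    """Render one field:value clause. Ports stay bare; string
    matches get double quotes, as in the queries in this file."""
    if isinstance(value, int):
        return f"{field}: {value}"
    return f'{field}: "{value}"'

def censys_query(*clauses):
    """Join (field, value) pairs with 'and', matching the file's syntax."""
    return " and ".join(censys_clause(f, v) for f, v in clauses)

# Reproduces the censys-ollama-11434 query above.
q = censys_query(
    ("services.port", 11434),
    ("services.http.response.body", "Ollama"),
)
# q == 'services.port: 11434 and services.http.response.body: "Ollama"'
```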
pkg/dorks/definitions/zoomeye/all.yaml (new file, 79 lines)
@@ -0,0 +1,79 @@
- id: zoomeye-ollama
  name: "Ollama on :11434 (ZoomEye)"
  source: zoomeye
  category: infrastructure
  query: 'port:11434 +app:"Ollama"'
  description: "Finds Ollama servers on their default port via ZoomEye app fingerprinting."
  tags: [ollama, zoomeye, infrastructure, tier1]

- id: zoomeye-vllm
  name: "vLLM title (ZoomEye)"
  source: zoomeye
  category: infrastructure
  query: 'title:"vLLM" +app:"nginx"'
  description: "Finds vLLM servers fronted by nginx via ZoomEye title match."
  tags: [vllm, zoomeye, infrastructure]

- id: zoomeye-localai
  name: "LocalAI title (ZoomEye)"
  source: zoomeye
  category: infrastructure
  query: 'title:"LocalAI"'
  description: "Finds LocalAI self-hosted servers by HTML title."
  tags: [localai, zoomeye, infrastructure]

- id: zoomeye-openwebui
  name: "Open WebUI title (ZoomEye)"
  source: zoomeye
  category: infrastructure
  query: 'title:"Open WebUI"'
  description: "Finds exposed Open WebUI dashboards via ZoomEye."
  tags: [openwebui, zoomeye, infrastructure]

- id: zoomeye-litellm
  name: "LiteLLM proxy :4000 (ZoomEye)"
  source: zoomeye
  category: infrastructure
  query: 'title:"LiteLLM" +port:4000'
  description: "Finds LiteLLM proxies on their default admin port via ZoomEye."
  tags: [litellm, zoomeye, infrastructure]

- id: zoomeye-lmstudio
  name: "LM Studio title (ZoomEye)"
  source: zoomeye
  category: infrastructure
  query: 'title:"LM Studio"'
  description: "Finds exposed LM Studio local-model servers."
  tags: [lmstudio, zoomeye, infrastructure]

- id: zoomeye-triton-grpc
  name: "Triton gRPC :8001 (ZoomEye)"
  source: zoomeye
  category: infrastructure
  query: 'port:8001 +service:"triton"'
  description: "Finds NVIDIA Triton gRPC endpoints exposed on their default port."
  tags: [triton, nvidia, zoomeye, infrastructure]

- id: zoomeye-fastchat
  name: "FastChat title (ZoomEye)"
  source: zoomeye
  category: infrastructure
  query: 'title:"FastChat"'
  description: "Finds FastChat multi-model serving dashboards via ZoomEye."
  tags: [fastchat, zoomeye, infrastructure]

- id: zoomeye-langserve
  name: "LangServe title (ZoomEye)"
  source: zoomeye
  category: infrastructure
  query: 'title:"LangServe"'
  description: "Finds LangServe (LangChain) servers via ZoomEye."
  tags: [langserve, langchain, zoomeye, infrastructure]

- id: zoomeye-openai-proxy
  name: "OpenAI-compatible proxy (ZoomEye)"
  source: zoomeye
  category: frontier
  query: 'title:"openai" +"/v1/chat/completions"'
  description: "Finds OpenAI-compatible proxy servers advertising the chat completions endpoint."
  tags: [openai, proxy, zoomeye, frontier]
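The ZoomEye queries in this file use only the four operators named in the commit message — app, title, port, and service — with additional terms AND-ed via a leading `+`. A hedged sketch of composing queries in that dialect; the operator set is taken from this file, while the helper and its argument order are illustrative:

```python
def zoomeye_query(title=None, app=None, service=None, port=None):
    """Compose a ZoomEye query from the four operators these dorks
    use. Terms after the first are joined with ' +', mirroring the
    queries in this file. Illustrative helper, not repo code."""
    parts = []
    if title:
        parts.append(f'title:"{title}"')
    if app:
        parts.append(f'app:"{app}"')
    if service:
        parts.append(f'service:"{service}"')
    if port:
        parts.append(f"port:{port}")
    return " +".join(parts)

# Reproduces the zoomeye-litellm query above.
litellm = zoomeye_query(title="LiteLLM", port=4000)
# Same fingerprint as zoomeye-triton-grpc, with service before port.
triton = zoomeye_query(service="triton", port=8001)
```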