
# sublimesec/vestibule

This image is the official ingestion component of the Sublime Security analysis platform. It receives, preprocesses, and normalizes security data, acting as the middleware that connects raw data sources to the security analysis engine. As the first stage of the data pipeline, the component ensures that heterogeneous security data (logs, threat intelligence, event alerts, and so on) is uniformly processed and handed to downstream analysis in a structured format.
```bash
docker run -d \
  --name sublime-ingestion \
  -e DATA_SOURCE_TYPE="kafka" \
  -e KAFKA_BROKERS="kafka:9092" \
  -e KAFKA_TOPIC="security-logs" \
  -e OUTPUT_DESTINATION="sublime-engine:8080" \
  -v ./config:/app/config \
  sublimelabs/ingestion:latest
```
| Environment variable | Description | Example | Required |
|---|---|---|---|
| DATA_SOURCE_TYPE | Data source type (api/file/kafka/rabbitmq) | kafka | Yes |
| OUTPUT_DESTINATION | Output destination (Sublime engine address) | [***] | Yes |
| LOG_LEVEL | Log level (debug/info/warn/error) | info | No |
| PROCESSING_THREADS | Number of data-processing threads | 4 | No |
| MAX_BATCH_SIZE | Number of records per processing batch | 1000 | No |
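The optional variables above presumably fall back to defaults when unset. The sketch below illustrates that pattern; the function name, the validation logic, and the default values are assumptions for illustration, not the image's actual code (the table lists example values, not documented defaults).

```python
import os

def load_settings(env=os.environ):
    """Illustrative sketch: validate required variables, default the rest."""
    required = ["DATA_SOURCE_TYPE", "OUTPUT_DESTINATION"]
    missing = [name for name in required if name not in env]
    if missing:
        raise ValueError(f"missing required settings: {missing}")
    return {
        "source_type": env["DATA_SOURCE_TYPE"],
        "destination": env["OUTPUT_DESTINATION"],
        # Optional settings with assumed defaults
        "log_level": env.get("LOG_LEVEL", "info"),
        "threads": int(env.get("PROCESSING_THREADS", "4")),
        "max_batch_size": int(env.get("MAX_BATCH_SIZE", "1000")),
    }
```

A container entrypoint following this pattern would fail fast on a missing required variable instead of starting with a broken configuration.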
Fine-grained ingestion rules can be defined by mounting a configuration file at /app/config/ingestion.yaml. Example structure:
```yaml
sources:
  - type: kafka
    brokers: ["kafka-1:9092", "kafka-2:9092"]
    topic: "security-events"
    consumer_group: "sublime-ingestion"
    auto_offset_reset: "earliest"
  - type: file
    path: "/data/logs/*.log"
    format: "json"
    poll_interval: 30s

processing:
  filters:
    - exclude_fields: ["sensitive_field", "internal_id"]
    - include_patterns:
        event_type: ["alert", "threat"]
  transformations:
    - field: "timestamp"
      action: "convert"
      target_format: "ISO8601"

output:
  destination: "[***]"
  retry_policy:
    max_attempts: 3
    backoff_delay: 5s
```
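To make the `processing` section concrete, here is a minimal sketch of what those rules imply for a single record: drop records whose `event_type` is not in `include_patterns`, remove the `exclude_fields`, and convert the timestamp to ISO8601. The function name and the assumption that incoming timestamps are Unix epoch seconds are illustrative, not the component's documented behavior.

```python
from datetime import datetime, timezone

EXCLUDE_FIELDS = {"sensitive_field", "internal_id"}   # from filters.exclude_fields
INCLUDE_EVENT_TYPES = {"alert", "threat"}             # from filters.include_patterns

def process(record):
    """Apply the example ingestion.yaml rules to one record (illustrative)."""
    # include_patterns: keep only alert/threat events
    if record.get("event_type") not in INCLUDE_EVENT_TYPES:
        return None
    # exclude_fields: strip sensitive/internal keys
    cleaned = {k: v for k, v in record.items() if k not in EXCLUDE_FIELDS}
    # transformations: convert a numeric epoch timestamp to ISO8601
    ts = cleaned.get("timestamp")
    if isinstance(ts, (int, float)):
        cleaned["timestamp"] = datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()
    return cleaned
```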
Docker Compose example:

```yaml
version: "3.8"

services:
  ingestion:
    image: sublimelabs/ingestion:latest
    container_name: sublime-ingestion
    restart: always
    environment:
      - DATA_SOURCE_TYPE=kafka
      - LOG_LEVEL=info
      - PROCESSING_THREADS=4
    volumes:
      - ./ingestion-config:/app/config
      - ./logs:/data/logs
    depends_on:
      - kafka
      - sublime-engine
    networks:
      - security-network

networks:
  security-network:
    driver: bridge
```
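The `retry_policy` in the ingestion.yaml example above (`max_attempts: 3`, `backoff_delay: 5s`) can be read as the following delivery loop. Whether the real component uses a fixed or exponential delay is not documented; a fixed delay is assumed here, and `send_with_retry` is an illustrative name, not the image's API.

```python
import time

def send_with_retry(send, payload, max_attempts=3, backoff_delay=5.0, sleep=time.sleep):
    """Deliver payload, retrying up to max_attempts with a fixed delay (sketch)."""
    last_error = None
    for attempt in range(1, max_attempts + 1):
        try:
            return send(payload)
        except Exception as exc:
            last_error = exc
            if attempt < max_attempts:
                sleep(backoff_delay)  # wait before the next attempt
    raise last_error  # all attempts exhausted
```

Injecting `sleep` as a parameter keeps the loop testable without real delays.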





