snowplow/lake-loader-gcp

This image is a component of the Snowplow data-processing ecosystem, built for GCP (Google Cloud Platform). It consumes event data from a Snowplow GCP pipeline and converts it to Delta Lake and Apache Iceberg formats, supporting data-lake integration and downstream analytics.
```bash
docker run -d \
  --name snowplow-gcp-loader \
  -e GCP_PROJECT=my-gcp-project \
  -e INPUT_SUBSCRIPTION=projects/my-gcp-project/subscriptions/snowplow-events-sub \
  -e DELTA_OUTPUT_PATH=gs://my-datalake/delta/snowplow-events \
  -e ICEBERG_OUTPUT_PATH=gs://my-datalake/iceberg/snowplow-events \
  -e RUN_MODE=streaming \
  snowplow/gcp-delta-iceberg-loader:latest
```
| Environment variable | Description | Example value | Required |
|---|---|---|---|
| GCP_PROJECT | GCP project ID | my-gcp-project | Yes |
| INPUT_SUBSCRIPTION | GCP Pub/Sub subscription path (event data source) | projects/my-gcp-project/subscriptions/events-sub | Yes |
| DELTA_OUTPUT_PATH | Output path for Delta-format data (GCS path) | gs://datalake/delta/events | No |
| ICEBERG_OUTPUT_PATH | Output path for Iceberg-format data (GCS path) | gs://datalake/iceberg/events | No |
| RUN_MODE | Run mode (streaming/batch) | streaming | No |
| BATCH_SIZE | Event batch size in batch mode | *** | No |
| DELTA_TABLE_PROPERTIES | Delta table properties (JSON format) | {"delta.logRetentionDuration":"7 days"} | No |
Note: at least one of DELTA_OUTPUT_PATH and ICEBERG_OUTPUT_PATH must be configured; otherwise the container will fail to start.
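The startup requirement above can be mirrored with a small pre-flight check before launching the container. `check_outputs` is a hypothetical helper for illustration, not part of the image:

```shell
# Pre-flight check mirroring the loader's startup validation:
# at least one of DELTA_OUTPUT_PATH / ICEBERG_OUTPUT_PATH must be set.
check_outputs() {
  if [ -z "$DELTA_OUTPUT_PATH" ] && [ -z "$ICEBERG_OUTPUT_PATH" ]; then
    echo "error: set DELTA_OUTPUT_PATH or ICEBERG_OUTPUT_PATH" >&2
    return 1
  fi
  echo "output config ok"
}

# Passes because one output path is provided; with neither set it would
# print an error and return a non-zero status.
DELTA_OUTPUT_PATH=gs://my-datalake/delta/snowplow-events check_outputs
```

Running such a check in a wrapper script before `docker run` surfaces the misconfiguration immediately instead of through a failed container start.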
Mount an external configuration file for finer-grained control:
```bash
docker run -d \
  --name snowplow-gcp-loader \
  -v /local/config:/app/config \
  -e CONFIG_PATH=/app/config/custom-config.yaml \
  snowplow/gcp-delta-iceberg-loader:latest
```
Example configuration file (YAML):
```yaml
gcp:
  project: my-gcp-project
  credentials_path: /secrets/gcp-key.json
input:
  subscription: projects/my-gcp-project/subscriptions/events-sub
  format: snowplow-enriched-json
output:
  delta:
    path: gs://datalake/delta/events
    partition_columns: [event_date, event_type]
    table_properties:
      delta.logRetentionDuration: "30 days"
      delta.deletedFileRetentionDuration: "7 days"
  iceberg:
    path: gs://datalake/iceberg/events
    partition_spec: "event_date:date, event_type:string"
    properties:
      write.metadata.delete-after-commit.enabled: "true"
processing:
  mode: streaming
  checkpoint_location: gs://datalake/checkpoints/snowplow-loader
```
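To prepare the mounted directory, the config can be written and sanity-checked locally before starting the container. A minimal sketch: the local path `/tmp/snowplow-config` is arbitrary, the config is a trimmed version of the example above, and the `grep` guard is only illustrative:

```shell
# Write a trimmed config into the directory that will be mounted at /app/config.
mkdir -p /tmp/snowplow-config
cat > /tmp/snowplow-config/custom-config.yaml <<'EOF'
gcp:
  project: my-gcp-project
input:
  subscription: projects/my-gcp-project/subscriptions/events-sub
output:
  delta:
    path: gs://datalake/delta/events
processing:
  mode: streaming
EOF

# Illustrative guard, mirroring the requirement that at least one
# gs:// output path is configured.
grep -q 'path: gs://' /tmp/snowplow-config/custom-config.yaml && echo "config ok"
```

The container would then be started with `-v /tmp/snowplow-config:/app/config` as in the mount example above.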