
arm64v8/spark is the official Apache Spark Docker image built for the ARM64 (arm64v8) architecture. Apache Spark is a unified analytics engine for large-scale data processing, data science, and machine learning, offering multi-language APIs (Scala, Java, Python, R) and distributed computing capabilities. This image makes it easy to deploy and run Spark applications in ARM64 environments and simplifies big-data processing workflows.

The image ships with interactive shells (spark-shell, pyspark, sparkR) for real-time data analysis and debugging.

| Tag | Dockerfile |
|---|---|
| 4.0.0-scala2.13-java21-python3-ubuntu, 4.0.0-java21-python3, 4.0.0-java21, python3, latest | Dockerfile |
| 4.0.0-scala2.13-java21-r-ubuntu, 4.0.0-java21-r | Dockerfile |
| 4.0.0-scala2.13-java21-ubuntu, 4.0.0-java21-scala | Dockerfile |
| 4.0.0-scala2.13-java21-python3-r-ubuntu | Dockerfile |
| 4.0.0-scala2.13-java17-python3-ubuntu, 4.0.0-python3, 4.0.0, python3-java17 | Dockerfile |
| 4.0.0-scala2.13-java17-r-ubuntu, 4.0.0-r, r | Dockerfile |
| 4.0.0-scala2.13-java17-ubuntu, 4.0.0-scala, scala | Dockerfile |
| 4.0.0-scala2.13-java17-python3-r-ubuntu | Dockerfile |
| 3.5.7-scala2.12-java17-python3-ubuntu, 3.5.7-java17-python3, 3.5.7-java17 | Dockerfile |
| 3.5.7-scala2.12-java17-r-ubuntu, 3.5.7-java17-r | Dockerfile |
| 3.5.7-scala2.12-java17-ubuntu, 3.5.7-java17-scala | Dockerfile |
| 3.5.7-scala2.12-java17-python3-r-ubuntu | Dockerfile |
| 3.5.7-scala2.12-java11-python3-ubuntu, 3.5.7-python3, 3.5.7 | Dockerfile |
| 3.5.7-scala2.12-java11-r-ubuntu, 3.5.7-r | Dockerfile |
| 3.5.7-scala2.12-java11-ubuntu, 3.5.7-scala | Dockerfile |
| 3.5.7-scala2.12-java11-python3-r-ubuntu | Dockerfile |
To get started with an interactive Spark session, launch the Scala shell:

```bash
docker run -it arm64v8/spark /opt/spark/bin/spark-shell
```

Example command (returns 1,000,000,000):

```scala
scala> spark.range(1000 * 1000 * 1000).count()
```
To use the Python shell, specify a python3 tag:

```bash
docker run -it arm64v8/spark:python3 /opt/spark/bin/pyspark
```

Example command:

```python
>>> spark.range(1000 * 1000 * 1000).count()
```

To use the R shell, specify an r tag:

```bash
docker run -it arm64v8/spark:r /opt/spark/bin/sparkR
```
Start a Spark Master node:

```bash
docker run -d \
  --name spark-master \
  -p 7077:7077 \
  -p 8080:8080 \
  arm64v8/spark \
  /opt/spark/bin/spark-class org.apache.spark.deploy.master.Master
```

Start a Spark Worker node and connect it to the Master:

```bash
docker run -d \
  --name spark-worker \
  --link spark-master:master \
  arm64v8/spark \
  /opt/spark/bin/spark-class org.apache.spark.deploy.worker.Worker spark://master:7077
```
Create a docker-compose.yml:

```yaml
version: '3'
services:
  master:
    image: arm64v8/spark
    container_name: spark-master
    ports:
      - "7077:7077"   # Master port
      - "8080:8080"   # Web UI port
    command: /opt/spark/bin/spark-class org.apache.spark.deploy.master.Master
  worker:
    image: arm64v8/spark
    container_name: spark-worker
    depends_on:
      - master
    # Note: Compose substitutes ${VAR} in `command:` from the host shell's
    # environment, not from the service's `environment:` section, so the
    # Master URL is passed literally here.
    command: /opt/spark/bin/spark-class org.apache.spark.deploy.worker.Worker spark://master:7077
```
Start the cluster:

```bash
docker-compose up -d
```

Spark also supports deployment on Kubernetes; see the official guide for details.

The image supports customizing Spark through environment variables; common options are described in the Apache Spark Docker image documentation. Key environment variables include:
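The Worker's resources can be capped through the standard `SPARK_WORKER_*` environment variables, which the standalone Worker reads at startup. A compose fragment sketch, with illustrative values, to merge into the `worker` service above:

```yaml
# Fragment for the worker service in docker-compose.yml (values are illustrative).
  worker:
    environment:
      - SPARK_WORKER_CORES=2      # limit the Worker to 2 CPU cores
      - SPARK_WORKER_MEMORY=4g    # limit the Worker to 4 GiB of memory
```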
- `SPARK_HOME`: Spark installation path (default `/opt/spark`)
- `SPARK_MASTER`: Master node address (e.g. `spark://master:7077`)
- `SPARK_WORKER_CORES`: number of CPU cores available to a Worker node
- `SPARK_WORKER_MEMORY`: amount of memory available to a Worker node (e.g. `4g`)

Apache Spark and its Docker images are licensed under the Apache License 2.0. The image may contain other software (such as base system tools) whose licenses users must verify for compliance themselves. For more licensing information, see the spark directory of the repo-info repository.

