# openeuler/onnxruntime

The official ONNX Runtime Docker image.
Maintained by: openEuler CloudNative SIG.
Where to get help: openEuler CloudNative SIG, openEuler.
Current ONNX Runtime Docker images are built on openEuler. This repository is free to use and exempt from per-user rate limits.
ONNX Runtime is a cross-platform inference and training machine-learning accelerator. ONNX Runtime inference can enable faster customer experiences and lower costs, supporting models from deep learning frameworks such as PyTorch and TensorFlow/Keras as well as classical machine learning libraries such as scikit-learn, LightGBM, XGBoost, etc. ONNX Runtime is compatible with different hardware, drivers, and operating systems, and provides optimal performance by leveraging hardware accelerators where applicable alongside graph optimizations and transforms.
Learn more on the ONNX Runtime website.
The tag of each ONNX Runtime Docker image consists of the complete software stack version. The details are as follows:
| Tag | Currently | Architectures |
|---|---|---|
| 1.22.1-oe2403sp2 | ONNX Runtime 1.22.1 on openEuler 24.03-LTS-SP2 | amd64, arm64 |
Ensure that you have Docker installed; on Windows, make sure Docker is configured to use Linux containers.
Obtain the ONNX Runtime Docker image. There are two ways to do this:

1. Pull the pre-built image from Docker Hub:

   ```
   docker pull openeuler/onnxruntime
   ```

2. Clone the source repository, navigate to the AI/onnxruntime folder, and build the image locally:

   ```
   docker build . -t openeuler/onnxruntime
   ```

Start a container from the image:

```
docker run -it openeuler/onnxruntime
```

If you have any questions or want to use some special features, please submit an issue or a pull request on openeuler-docker-images.
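As a quick smoke test, you can pull a versioned tag from the table above and check the ONNX Runtime version inside the container. This is a sketch under the assumption that the image ships `python3` with the `onnxruntime` Python package on its PATH:

```shell
# Pull a versioned tag (from the tag table) instead of the default tag
docker pull openeuler/onnxruntime:1.22.1-oe2403sp2

# Print the bundled ONNX Runtime version; assumes the image provides
# python3 with the onnxruntime package installed
docker run --rm openeuler/onnxruntime:1.22.1-oe2403sp2 \
    python3 -c "import onnxruntime; print(onnxruntime.__version__)"
```

Pinning the full software-stack tag (e.g. `1.22.1-oe2403sp2`) keeps builds reproducible even when the repository's default tag moves to a newer release.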