https://github.com/loong64/onnxruntime
ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator
- Host: GitHub
- URL: https://github.com/loong64/onnxruntime
- Owner: loong64
- License: MIT
- Created: 2025-05-29T14:10:53.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2025-05-29T14:17:46.000Z (4 months ago)
- Last Synced: 2025-05-29T15:51:11.758Z (4 months ago)
- Topics: ai-framework, deep-learning, hardware-acceleration, loong64, loongarch64, machine-learning, neural-networks, onnx, pytorch, scikit-learn, tensorflow
- Language: Dockerfile
- Homepage: https://github.com/microsoft/onnxruntime
- Size: 3.91 KB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
## README
**ONNX Runtime is a cross-platform inference and training machine-learning accelerator**.
**ONNX Runtime inference** can enable faster customer experiences and lower costs, supporting models from deep learning frameworks such as PyTorch and TensorFlow/Keras as well as classical machine learning libraries such as scikit-learn, LightGBM, and XGBoost. ONNX Runtime is compatible with different hardware, drivers, and operating systems, and provides optimal performance by leveraging hardware accelerators where applicable alongside graph optimizations and transforms. [Learn more →](https://www.onnxruntime.ai/docs/#onnx-runtime-for-inferencing)
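As a minimal sketch of what inference looks like through the Python API: the snippet below loads an exported model and runs it through an `InferenceSession`. The file name `model.onnx`, the input shape, and the dtype are placeholders; adapt them to your own model.

```python
# Minimal ONNX Runtime inference sketch (Python API).
# "model.onnx" and the input shape below are placeholders for your exported model.
import numpy as np
import onnxruntime as ort

# Execution providers are tried in order, so a hardware accelerator
# (e.g. "CUDAExecutionProvider") can be listed ahead of the CPU fallback.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Query the model's declared input name instead of hard-coding it.
input_name = session.get_inputs()[0].name

# Dummy input; shape and dtype must match what the model expects.
x = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Passing None for the output names requests all model outputs.
outputs = session.run(None, {input_name: x})
print(outputs[0].shape)
```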
**ONNX Runtime training** can accelerate the model training time on multi-node NVIDIA GPUs for transformer models with a one-line addition for existing PyTorch training scripts. [Learn more →](https://www.onnxruntime.ai/docs/#onnx-runtime-for-training)
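As a hedged illustration of that "one-line addition": the sketch below wraps a toy PyTorch model in `ORTModule` (from the `torch-ort` / `onnxruntime-training` packages) so the forward and backward passes execute through ONNX Runtime. The model, optimizer, and data are placeholders, not part of any real training script.

```python
# Sketch of accelerating an existing PyTorch training loop with ORTModule.
# Requires the torch-ort / onnxruntime-training packages; everything below
# except the ORTModule wrapping is a toy placeholder.
import torch
from torch_ort import ORTModule

model = torch.nn.Linear(10, 1)
model = ORTModule(model)  # the one-line addition: runs fwd/bwd via ONNX Runtime

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

x = torch.randn(32, 10)  # dummy batch of 32 samples with 10 features
y = torch.randn(32, 1)

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
```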
## Get Started & Resources
* **General Information**: [onnxruntime.ai](https://onnxruntime.ai)
* **Usage documentation and tutorials**: [onnxruntime.ai/docs](https://onnxruntime.ai/docs)
* **YouTube video tutorials**: [youtube.com/@ONNXRuntime](https://www.youtube.com/@ONNXRuntime)
* [**Upcoming Release Roadmap**](https://onnxruntime.ai/roadmap)
* **Companion sample repositories**:
- ONNX Runtime Inferencing: [microsoft/onnxruntime-inference-examples](https://github.com/microsoft/onnxruntime-inference-examples)
  - ONNX Runtime Training: [microsoft/onnxruntime-training-examples](https://github.com/microsoft/onnxruntime-training-examples)

## Built-in Pipeline Status
|System|Inference|Training|
|---|---|---|
|Windows|[build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=9) [build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=218) [build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=47) [build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=228)||
|Linux|[build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=11) [build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=64) [build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=12) [build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=45) [build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=55)|[build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=86) [build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=84)|
|Mac|[build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=13)||
|Android|[build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=53)||
|iOS|[build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=134)||
|Web|[build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=161)||
|Other|[build status](https://dev.azure.com/onnxruntime/onnxruntime/_build/latest?definitionId=187&repoName=microsoft%2Fonnxruntime)||

This project is tested with [BrowserStack](https://www.browserstack.com/home).
## Third-party Pipeline Status
|System|Inference|Training|
|---|---|---|
|Linux|[build status](https://github.com/Ascend/onnxruntime/actions/workflows/build-and-test.yaml)||

## Releases
The current release and past releases can be found here: https://github.com/microsoft/onnxruntime/releases.
For details on the upcoming release, including release dates, announcements, features, and guidance on submitting feature requests, please visit the release roadmap: https://onnxruntime.ai/roadmap.
## Data/Telemetry
Windows distributions of this project may collect usage data and send it to Microsoft to help improve our products and services. See the [privacy statement](docs/Privacy.md) for more details.
## Contributions and Feedback
We welcome contributions! Please see the [contribution guidelines](CONTRIBUTING.md).
For feature requests or bug reports, please file a [GitHub Issue](https://github.com/Microsoft/onnxruntime/issues).
For general discussion or questions, please use [GitHub Discussions](https://github.com/microsoft/onnxruntime/discussions).
## Code of Conduct
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/)
or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.

## License
This project is licensed under the [MIT License](LICENSE).