ONNX Runtime is a cross-platform inference and training machine-learning accelerator.
ONNX Runtime inference can enable faster customer experiences and lower costs, supporting models from deep learning frameworks such as PyTorch and TensorFlow/Keras as well as classical machine learning libraries such as scikit-learn, LightGBM, and XGBoost. ONNX Runtime is compatible with different hardware, drivers, and operating systems, and provides optimal performance by leveraging hardware accelerators where applicable alongside graph optimizations and transforms. Learn more →
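As a minimal sketch of what inference looks like with the `onnxruntime` Python package (the model path `model.onnx` and the 1×3×224×224 input shape are placeholders for your own model):

```python
import numpy as np
import onnxruntime as ort

# Create an inference session for an exported ONNX model.
# The CPU execution provider is always available; hardware-specific
# providers (e.g. "CUDAExecutionProvider") can be listed first so the
# runtime uses an accelerator when one is present.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Look up the model's input name and feed a matching NumPy array.
input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Run inference; passing None for the output names returns all outputs.
outputs = session.run(None, {input_name: dummy_input})
print(outputs[0].shape)
```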
ONNX Runtime training can accelerate model training time on multi-node NVIDIA GPUs for transformer models with a one-line addition to existing PyTorch training scripts. Learn more →
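The "one-line addition" wraps an existing `torch.nn.Module` with `ORTModule`. A hedged sketch, assuming the `onnxruntime-training` wheel which exposes `from onnxruntime.training import ORTModule` (some setups import it from the `torch-ort` package instead); the model, optimizer, and data below are placeholders:

```python
import torch
from onnxruntime.training import ORTModule  # provided by onnxruntime-training

model = torch.nn.Linear(784, 10)   # placeholder model; use your transformer here
model = ORTModule(model)           # the one-line change: wrap the PyTorch module

# The rest of the training loop is unchanged PyTorch.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
inputs = torch.randn(32, 784)
targets = torch.randint(0, 10, (32,))

loss = torch.nn.functional.cross_entropy(model(inputs), targets)
loss.backward()
optimizer.step()
```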
General Information: onnxruntime.ai
Usage documentation and tutorials: onnxruntime.ai/docs
YouTube video tutorials: youtube.com/@ONNXRuntime
Companion sample repositories:
- ONNX Runtime Inferencing: microsoft/onnxruntime-inference-examples
- ONNX Runtime Training: microsoft/onnxruntime-training-examples
The current release and past releases can be found here: https://github.com/microsoft/onnxruntime/releases.
For details on the upcoming release, including release dates, announcements, features, and guidance on submitting feature requests, please visit the release roadmap: https://onnxruntime.ai/roadmap.
Windows distributions of this project may collect usage data and send it to Microsoft to help improve our products and services. See the privacy statement for more details.
We welcome contributions! Please see the contribution guidelines.
For feature requests or bug reports, please file a GitHub Issue.
For general discussion or questions, please use GitHub Discussions.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
This project is licensed under the MIT License.
