Open Source ONNX Runtime
ONNX Runtime, a high-performance inference engine for machine-learning models in the [Open Neural Network Exchange](https://onnx.ai/) (ONNX) format, is now open source. ONNX Runtime is compatible with ONNX version 1.2 and ships as Python packages that support both CPU and GPU inferencing. With the release of the open source ONNX Runtime project, developers are free to customise and integrate the ONNX inference engine into their existing infrastructure directly from the source code, as well as compile and build it on a variety of operating systems.
**What is the ONNX format?**
[Open Neural Network Exchange](https://onnx.ai/) (ONNX), which Microsoft co-developed to make AI more accessible and valuable to all, is the basis of an open ecosystem for interoperability and innovation in AI. An open format for representing machine-learning models, ONNX enables AI developers to choose the right framework for their task and hardware vendors to streamline optimisations.
To learn more about ONNX Runtime, [read the blog](https://aka.ms/onnx-rt-os).
* Azure Machine Learning
* Microsoft Connect