Generally available: Model Serving on Azure Databricks
Model Serving on Azure Databricks is now generally available. Azure Databricks Model Serving deploys machine learning models as REST APIs, allowing you to build real-time ML applications such as personalised recommendations, customer service chatbots and fraud detection, all without the hassle of managing serving infrastructure. The first feature to launch under this model is Serverless Real-Time Inference.
This feature uses [Serverless compute](https://learn.microsoft.com/en-us/azure/databricks/serverless-compute/), which means that the endpoints and associated compute resources are managed and run in Azure Databricks. [Read more](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-inference/serverless/serverless-real-time-inference)
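Since each served model is exposed as a REST API, a client scores it with an authenticated HTTP POST to the endpoint's invocations URL. The sketch below shows one way to build such a request in Python; the workspace URL, endpoint name and token are hypothetical placeholders, and the JSON input format shown (`dataframe_records`) is one of the formats accepted by MLflow-based serving endpoints.

```python
import json

# Hypothetical values -- replace with your own workspace URL and endpoint name.
WORKSPACE_URL = "https://adb-1234567890123456.7.azuredatabricks.net"
ENDPOINT_NAME = "my-model-endpoint"


def build_scoring_request(records, token):
    """Build the URL, headers and JSON body for scoring a serving endpoint.

    `records` is a list of dicts, one per input row; `token` is a
    Databricks personal access token (or Azure AD token).
    """
    url = f"{WORKSPACE_URL}/serving-endpoints/{ENDPOINT_NAME}/invocations"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"dataframe_records": records})
    return url, headers, body


if __name__ == "__main__":
    url, headers, body = build_scoring_request(
        [{"feature_a": 1.0, "feature_b": 2.0}], token="<personal-access-token>"
    )
    # Send with any HTTP client, e.g.:
    #   import requests
    #   response = requests.post(url, headers=headers, data=body)
    print(url)
```

Because the endpoint runs on serverless compute, the client only needs this URL and a token; no cluster or serving infrastructure is provisioned by the caller.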
* Azure Databricks
* Pricing & Offerings
* Services
* [ Azure Databricks](https://azure.microsoft.com/en-gb/products/databricks/)
What else is happening at Microsoft Azure?

* Generally Available: Storage account default maximum request rate limit increase to 40,000 requests per second (December 12th, 2024)
* Generally Available: Regional Disaster Recovery by Azure Backup for AKS (November 22nd, 2024)
* Generally Available: Enhancements on Azure Container Storage for performance, scalability, and operational insights (November 19th, 2024)