Generally available: Model Serving on Azure Databricks
Model Serving on Azure Databricks is now generally available. Azure Databricks Model Serving deploys machine learning models as REST APIs, allowing you to build real-time ML applications such as personalised recommendations, customer service chatbots and fraud detection, all without the hassle of managing serving infrastructure. The first feature to launch under this model is Serverless Real-Time Inference.
This feature uses [serverless compute](https://learn.microsoft.com/en-us/azure/databricks/serverless-compute/), which means that the endpoints and associated compute resources are managed and run in Azure Databricks. [Read more](https://learn.microsoft.com/en-us/azure/databricks/machine-learning/model-inference/serverless/serverless-real-time-inference).
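As a rough illustration of what "deployed as a REST API" looks like in practice, the sketch below builds a scoring request for a serving endpoint. The workspace URL, endpoint name and input schema are hypothetical placeholders; consult the Read more link above for the exact request format your endpoint expects.

```python
import json

# Hypothetical workspace URL, endpoint name and token -- substitute your own.
WORKSPACE_URL = "https://adb-1234567890123456.7.azuredatabricks.net"
ENDPOINT_NAME = "recommender"
API_TOKEN = "<personal-access-token>"


def build_invocation_request(records):
    """Build the URL, headers and JSON body for a model scoring request.

    `records` is a list of dicts, one per row to score.
    """
    url = f"{WORKSPACE_URL}/serving-endpoints/{ENDPOINT_NAME}/invocations"
    headers = {
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"dataframe_records": records})
    return url, headers, body


# Example: score a single (user, item) pair -- a hypothetical input schema.
url, headers, body = build_invocation_request([{"user_id": 42, "item_id": 7}])
print(url)
```

The request itself would then be sent with any HTTP client (for example `requests.post(url, headers=headers, data=body)`); the sketch stops short of the network call so it can run anywhere.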
* [Azure Databricks](https://azure.microsoft.com/en-gb/products/databricks/)
* Pricing & Offerings
* Services