
Two new models for agentic coding and efficient AI are now available in Amazon SageMaker JumpStart


Today, AWS announced the availability of GLM-5.1-FP8 and Phi-4-mini-instruct in Amazon SageMaker JumpStart, expanding the portfolio of foundation models available to AWS customers. These models from Z.ai and Microsoft bring advanced agentic capabilities and efficient inference to enterprise AI workloads on AWS infrastructure. They address different enterprise AI challenges with specialized capabilities:

**GLM-5.1-FP8** excels at agentic software engineering with sustained multi-round optimization, handling repository-level code generation, terminal tasks, and complex debugging workflows that improve with extended reasoning. It is ideal for automated code review pipelines, AI-powered development environments, and long-horizon problem solving where the model iterates over hundreds of rounds to refine a solution.

**Phi-4-mini-instruct** delivers strong reasoning, math, and logic in memory-constrained and latency-bound environments, supporting 24 languages and function calling in a compact form factor. It is ideal for edge deployment, latency-sensitive applications, multilingual chatbots, and scenarios that need capable reasoning with minimal resource overhead.

With SageMaker JumpStart, customers can deploy either of these models with just a few clicks to address their specific AI use cases. To get started, navigate to the Models section of SageMaker Studio or use the SageMaker Python SDK to deploy the models to your AWS account. For more information about deploying and using foundation models in SageMaker JumpStart, see the [Amazon SageMaker JumpStart documentation](https://docs.aws.amazon.com/sagemaker/latest/dg/studio-jumpstart.html).
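For readers using the SageMaker Python SDK route, a deployment typically goes through the `JumpStartModel` class. The sketch below is a minimal, hedged example: the `model_id` and `instance_type` values are assumptions for illustration only; look up the exact model ID and supported instance types for GLM-5.1-FP8 or Phi-4-mini-instruct in the Models section of SageMaker Studio. Running it requires AWS credentials and a SageMaker execution role, so the deployment logic is wrapped in a function rather than executed at import time.

```python
def deploy_jumpstart_model(model_id: str, instance_type: str = "ml.g5.2xlarge"):
    """Deploy a SageMaker JumpStart model and return a Predictor.

    Requires AWS credentials and a SageMaker execution role to be
    configured in the environment. Instance type is an assumption;
    check the model card for supported instance types.
    """
    # Imported here so the sketch can be read/loaded without the SDK installed.
    from sagemaker.jumpstart.model import JumpStartModel

    model = JumpStartModel(model_id=model_id)
    predictor = model.deploy(
        initial_instance_count=1,
        instance_type=instance_type,
    )
    return predictor


# Hypothetical usage -- the model ID below is a placeholder, not a confirmed ID:
# predictor = deploy_jumpstart_model("huggingface-llm-phi-4-mini-instruct")
# response = predictor.predict({"inputs": "Summarize binary search in one sentence."})
# predictor.delete_endpoint()  # clean up to stop incurring charges
```

Remember to delete the endpoint when you are done, since deployed endpoints incur charges while running.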