Amazon Bedrock now supports batch inference

You can now use Amazon Bedrock to process prompts in batches and get responses for model evaluation, experimentation, and offline processing. Using the batch API makes it more efficient to run inference with foundation models (FMs), and it lets you aggregate responses and analyze them in batches. Batch processing is generally available in all regions where Amazon Bedrock is available. For more information, see [AWS Services by Region](https://aws.amazon.com/about-aws/global-infrastructure/regional-product-services/). To learn more about batch inference in Amazon Bedrock, see the [Amazon Bedrock API reference](https://docs.aws.amazon.com/bedrock/latest/APIReference/welcome.html). Pricing for batch mode is the same as for On-Demand mode. For details, see the [Amazon Bedrock pricing page](https://aws.amazon.com/bedrock/pricing/).
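As a rough sketch of what using the batch API looks like, the snippet below assembles a request for the `CreateModelInvocationJob` operation via boto3. The bucket names, IAM role ARN, job name, and model ID are placeholders, not values from this announcement; a batch job reads a JSONL file of prompt records from S3 and writes the model's responses back to S3.

```python
# Hedged sketch: submitting an Amazon Bedrock batch inference job with boto3.
# All identifiers below (buckets, role ARN, model ID) are illustrative placeholders.

def build_batch_job_request(job_name, model_id, role_arn, input_uri, output_uri):
    """Assemble the keyword arguments for bedrock.create_model_invocation_job."""
    return {
        "jobName": job_name,
        "modelId": model_id,
        "roleArn": role_arn,  # IAM role Bedrock assumes to read/write your S3 data
        "inputDataConfig": {
            # Points at a JSONL file where each line is one prompt record
            "s3InputDataConfig": {"s3Uri": input_uri}
        },
        "outputDataConfig": {
            # Prefix where the batched model responses are written
            "s3OutputDataConfig": {"s3Uri": output_uri}
        },
    }

request = build_batch_job_request(
    job_name="my-batch-job",
    model_id="anthropic.claude-3-5-sonnet-20240620-v1:0",
    role_arn="arn:aws:iam::123456789012:role/BedrockBatchRole",
    input_uri="s3://my-bucket/input/prompts.jsonl",
    output_uri="s3://my-bucket/output/",
)

# To actually submit the job (requires AWS credentials and permissions):
# import boto3
# bedrock = boto3.client("bedrock")
# response = bedrock.create_model_invocation_job(**request)
# print(response["jobArn"])
```

Once submitted, the job runs asynchronously; you can poll its status with `get_model_invocation_job` and then collect the aggregated responses from the output S3 prefix for offline analysis.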