Amazon Bedrock now supports batch inference
You can now use Amazon Bedrock to process prompts in batch and retrieve responses for model evaluation, experimentation, and offline processing.
Using the batch API makes it more efficient to run inference with foundation models (FMs), and it lets you aggregate responses and analyze them in batches.
Batch inference is generally available in all AWS Regions where Amazon Bedrock is available. For more information, see [AWS Services by Region](https://aws.amazon.com/about-aws/global-infrastructure/regional-product-services/).
To learn more about batch inference in Amazon Bedrock, see [Amazon Bedrock API reference](https://docs.aws.amazon.com/bedrock/latest/APIReference/welcome.html). Pricing for batch mode is the same as pricing for On-Demand mode. For details, see the [Amazon Bedrock pricing page](https://aws.amazon.com/bedrock/pricing/).
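As a quick illustration, the sketch below submits a batch inference job with the AWS SDK for Python (Boto3) via the CreateModelInvocationJob operation. The job name, IAM role ARN, model ID, and S3 locations are placeholder assumptions; replace them with your own resources, and note that the input must be a JSONL file of model invocation records in Amazon S3.

```python
import boto3

# Minimal sketch: submit a Bedrock batch inference job.
# All resource names below are placeholders -- substitute your own
# IAM role, S3 buckets, and model ID.
bedrock = boto3.client("bedrock", region_name="us-east-1")

response = bedrock.create_model_invocation_job(
    jobName="my-batch-inference-job",
    roleArn="arn:aws:iam::111122223333:role/BedrockBatchRole",
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    inputDataConfig={
        "s3InputDataConfig": {
            # JSONL file where each line is a model invocation record
            "s3Uri": "s3://amzn-s3-demo-bucket-input/prompts.jsonl"
        }
    },
    outputDataConfig={
        "s3OutputDataConfig": {
            # Results are written back to this prefix as JSONL
            "s3Uri": "s3://amzn-s3-demo-bucket-output/"
        }
    },
)

job_arn = response["jobArn"]

# Check on the job; poll until it reaches a terminal state.
status = bedrock.get_model_invocation_job(jobIdentifier=job_arn)["status"]
print(job_arn, status)
```

Once the job completes, the aggregated responses land in the output S3 location, where they can be analyzed offline or fed into model evaluation workflows.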