Amazon Bedrock now supports batch inference
You can now use Amazon Bedrock to process prompts in batch and retrieve the responses for model evaluation, experimentation, and offline processing.
Using the batch API makes it more efficient to run inference with foundation models (FMs), and it lets you aggregate responses and analyze them in batches.
Batch processing is generally available in all regions where Amazon Bedrock is available. For more information, see [AWS Services by Region](https://aws.amazon.com/about-aws/global-infrastructure/regional-product-services/).
To learn more about batch inference in Amazon Bedrock, see [Amazon Bedrock API reference](https://docs.aws.amazon.com/bedrock/latest/APIReference/welcome.html). Pricing for batch mode is the same as pricing for On-Demand mode. For details, see the [Amazon Bedrock pricing page](https://aws.amazon.com/bedrock/pricing/).
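As a minimal sketch of how a batch job is prepared: batch inference takes a JSONL file in S3, where each line is a JSON object containing a `recordId` and a `modelInput` payload in the request schema of the chosen model. The helper below builds such records using the Anthropic Claude messages schema as an example; the bucket names, role ARN, and job name in the commented-out `boto3` call are illustrative placeholders, not values from this announcement.

```python
import json


def build_batch_records(prompts, max_tokens=512):
    """Build JSONL records for a Bedrock batch inference input file.

    Each line pairs a caller-chosen recordId with a modelInput payload.
    The modelInput shape shown here follows the Anthropic Claude
    messages schema; other model families expect their own schemas.
    """
    lines = []
    for i, prompt in enumerate(prompts):
        record = {
            "recordId": f"REC{i:07d}",
            "modelInput": {
                "anthropic_version": "bedrock-2023-05-31",
                "max_tokens": max_tokens,
                "messages": [{"role": "user", "content": prompt}],
            },
        }
        lines.append(json.dumps(record))
    return "\n".join(lines)


# After uploading the JSONL file to S3, start the job (sketch only;
# requires AWS credentials and an IAM role Bedrock can assume —
# the names below are hypothetical):
#
# import boto3
# bedrock = boto3.client("bedrock")
# bedrock.create_model_invocation_job(
#     jobName="my-batch-job",
#     modelId="anthropic.claude-3-haiku-20240307-v1:0",
#     roleArn="arn:aws:iam::123456789012:role/BedrockBatchRole",
#     inputDataConfig={"s3InputDataConfig": {"s3Uri": "s3://my-bucket/input.jsonl"}},
#     outputDataConfig={"s3OutputDataConfig": {"s3Uri": "s3://my-bucket/output/"}},
# )
```

Results land in the output S3 prefix as a JSONL file pairing each `recordId` with its model response, which is what enables the aggregate analysis described above.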
What else is happening at Amazon Web Services?

- Amazon AppStream 2.0 users can now save their user preferences between streaming sessions (December 13th, 2024)
- AWS Elemental MediaConnect Gateway now supports source-specific multicast (December 13th, 2024)
- Amazon EC2 instances support bandwidth configurations for VPC and EBS (December 13th, 2024)
- AWS announces new AWS Direct Connect location in Osaka, Japan (December 13th, 2024)
- Amazon DynamoDB announces support for FIPS 140-3 interface VPC and Streams endpoints (December 13th, 2024)