
AWS announces vector search for Amazon DocumentDB



Amazon DocumentDB (with MongoDB compatibility) now supports vector search, a new capability that enables you to store, index, and search millions of vectors with millisecond response times. Vectors are numerical representations of unstructured data, such as text, produced by machine learning (ML) models that capture the semantic meaning of the underlying data. Vector search for Amazon DocumentDB can store vectors from Amazon Bedrock, Amazon SageMaker, and other sources.

There are no upfront commitments or additional costs to use vector search; you pay only for the data you store and the compute resources you use. With vector search for Amazon DocumentDB, you can simply set up, operate, and scale databases for your ML applications, including generative AI applications. You no longer have to spend time managing separate vector infrastructure, writing code to connect with another service, or duplicating data from your source database.

The vector search capability, together with large language models (LLMs), enables you to search the database based on meaning, unlocking a wide range of use cases, including semantic search, product recommendations, personalization, and chatbots.

Vector search for Amazon DocumentDB is available on DocumentDB 5.0 instance-based clusters in all regions where Amazon DocumentDB is available. You can get started by launching an Amazon DocumentDB cluster directly from the AWS Console or the AWS CLI. Learn more about vector search in the Amazon DocumentDB features page and developer guide.
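To make "searching based on meaning" concrete, here is a minimal, self-contained sketch of the idea behind vector search: documents are stored with embedding vectors, and a query is answered by ranking documents by cosine similarity to the query's embedding. This is a conceptual illustration only, not the Amazon DocumentDB API; the document texts and 3-dimensional vectors are made-up stand-ins for embeddings that, in a real application, would come from a model hosted on Amazon Bedrock or Amazon SageMaker.

```python
import math

# Toy in-memory "collection": documents with precomputed embedding vectors.
# The 3-dimensional vectors here are hypothetical stand-ins; real embeddings
# from an ML model typically have hundreds or thousands of dimensions.
documents = [
    {"_id": 1, "text": "waterproof hiking boots", "embedding": [0.9, 0.1, 0.0]},
    {"_id": 2, "text": "trail running shoes",     "embedding": [0.8, 0.3, 0.1]},
    {"_id": 3, "text": "stainless steel kettle",  "embedding": [0.0, 0.2, 0.9]},
]

def cosine_similarity(a, b):
    """Cosine similarity: closer to 1.0 means more similar in meaning."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def vector_search(query_vector, docs, k=2):
    """Return the k documents whose embeddings are closest to the query."""
    ranked = sorted(
        docs,
        key=lambda d: cosine_similarity(query_vector, d["embedding"]),
        reverse=True,
    )
    return ranked[:k]

# A hypothetical query embedding for something like "outdoor footwear".
query = [0.85, 0.2, 0.05]
for doc in vector_search(query, documents):
    print(doc["_id"], doc["text"])  # the two footwear documents rank highest
```

A managed vector search feature does the same ranking at scale: instead of a linear scan, it maintains an approximate-nearest-neighbor index over millions of vectors inside the database, so the query above becomes a single database operation with millisecond response times.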