
AWS announces vector search for Amazon MemoryDB for Redis (Preview)


[Amazon MemoryDB for Redis](https://aws.amazon.com/memorydb/) now supports vector search in preview, a new capability that enables you to store, index, and search vectors. MemoryDB is a database that combines in-memory performance with multi-AZ durability. With vector search for MemoryDB, you can develop real-time machine learning (ML) and generative AI applications with the highest performance demands using the popular, open-source Redis API.

Vector search for MemoryDB supports storing millions of vectors, with single-digit millisecond query and update response times, and tens of thousands of queries per second (QPS) at greater than 99% recall. You can generate vector embeddings using AI/ML services like Amazon Bedrock and Amazon SageMaker, and store them within MemoryDB. Because the preview delivers high throughput at high recall with single-digit millisecond vector query and update latencies, it suits latency-sensitive workloads: for example, a bank can use vector search for MemoryDB to detect anomalies such as fraudulent transactions during periods of high transactional volume, with minimal false positives.

Vector search for MemoryDB is available in preview in the US East (N. Virginia), US East (Ohio), Europe (Ireland), US West (Oregon), and Asia Pacific (Tokyo) Regions at no additional cost. To get started, create a new MemoryDB cluster using Amazon MemoryDB for Redis version 7.1 and enable the vector search preview through the AWS Management Console or AWS Command Line Interface (CLI). Learn more about vector search for Amazon MemoryDB in the [documentation](https://docs.aws.amazon.com/memorydb/latest/devguide/vector-search.html).
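Because vector search for MemoryDB is exposed through the Redis API, the basic flow can be sketched with a standard Redis client: create a vector index, write embeddings as binary fields, and run a KNN query. This is a minimal sketch, assuming the preview supports Redis-style `FT.CREATE`/`FT.SEARCH` vector commands as described in the MemoryDB documentation; the endpoint, index name, key prefix, and field names are illustrative placeholders, not values from the announcement.

```python
import struct

def to_float32_bytes(vec):
    """Serialize a vector as little-endian float32 bytes, the binary
    layout Redis-style vector fields expect inside a HASH field."""
    return struct.pack(f"<{len(vec)}f", *vec)

def vector_search_demo(client, dim=1536):
    """Sketch of index creation, ingestion, and a KNN query.

    `client` is assumed to be a redis-py style client connected to a
    MemoryDB cluster endpoint (hypothetical setup, not shown here).
    """
    # Create an HNSW vector index over HASH keys prefixed "doc:".
    client.execute_command(
        "FT.CREATE", "idx", "ON", "HASH", "PREFIX", "1", "doc:",
        "SCHEMA", "embedding", "VECTOR", "HNSW", "6",
        "TYPE", "FLOAT32", "DIM", str(dim), "DISTANCE_METRIC", "COSINE",
    )
    # Store an embedding (e.g. one generated by Amazon Bedrock or
    # Amazon SageMaker) under a document key.
    client.hset("doc:1", mapping={"embedding": to_float32_bytes([0.1] * dim)})
    # KNN query: the 3 nearest neighbors to a query vector.
    return client.execute_command(
        "FT.SEARCH", "idx", "*=>[KNN 3 @embedding $vec]",
        "PARAMS", "2", "vec", to_float32_bytes([0.1] * dim),
        "DIALECT", "2",
    )
```

In practice you would pass a `redis.Redis` (or cluster) client pointed at your MemoryDB endpoint with TLS enabled; the serialization helper is the only part that runs without a live cluster.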