Amazon Neptune Now Integrated with Zep to Power Long-Term Memory for GenAI Applications



Today, we’re announcing the integration of Amazon Neptune with Zep, an open-source memory server for LLM applications. Zep enables developers to persist, retrieve, and enrich user interaction history, providing long-term memory and context for AI agents. With this launch, customers can use Neptune Database or Neptune Analytics as the underlying graph store and Amazon OpenSearch Service as the text-search store for Zep’s memory system, enabling graph-powered memory retrieval and reasoning.

Zep users can now store and query memory graphs at scale, unlocking multi-hop reasoning and hybrid retrieval across graph, vector, and keyword modalities. By combining Zep’s memory orchestration with Neptune’s graph-native knowledge representation, developers can build more personalized, context-aware, and intelligent LLM agents that remember user interactions, extract structured knowledge, reason across memory, and improve over time. To learn more about the Neptune–Zep integration, check [the sample notebook](https://github.com/getzep/graphiti/blob/main/examples/quickstart/quickstart%5Fneptune.py).
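As a rough illustration of the pattern described above, the sketch below points a Graphiti-based Zep memory graph at Neptune for graph storage and OpenSearch for text search, then stores an interaction and runs a hybrid retrieval query. The driver class, constructor parameters, and endpoint strings are assumptions for illustration only; the linked quickstart notebook is the authoritative reference for the exact API.

```python
# Illustrative sketch only -- driver class and parameter names are assumptions;
# see the linked quickstart_neptune.py for the exact, supported API.
import asyncio
from datetime import datetime, timezone

from graphiti_core import Graphiti
from graphiti_core.nodes import EpisodeType
# Assumed import path for the Neptune graph driver.
from graphiti_core.driver.neptune_driver import NeptuneDriver


async def main():
    # Point graph storage at Neptune and full-text search at OpenSearch
    # (endpoint values below are placeholders, not real resources).
    driver = NeptuneDriver(
        host="my-neptune-cluster.cluster-xxxx.us-east-1.neptune.amazonaws.com",
        aoss_host="https://my-collection.us-east-1.aoss.amazonaws.com",
    )
    graphiti = Graphiti(graph_driver=driver)
    await graphiti.build_indices_and_constraints()

    # Persist an interaction as an "episode"; Graphiti extracts entities and
    # relationships from it into the Neptune-backed knowledge graph.
    await graphiti.add_episode(
        name="support-chat-001",
        episode_body="The customer asked about upgrading to the Pro plan.",
        source=EpisodeType.text,
        source_description="chat transcript",
        reference_time=datetime.now(timezone.utc),
    )

    # Hybrid retrieval combining graph, vector, and keyword signals.
    results = await graphiti.search("What plan is the customer interested in?")
    for edge in results:
        print(edge.fact)

    await graphiti.close()


if __name__ == "__main__":
    asyncio.run(main())
```

The key design point is that memory orchestration (episodes, entity extraction, retrieval) stays in Zep/Graphiti, while Neptune holds the knowledge graph and OpenSearch handles keyword search, so the agent's memory can scale and be queried with multi-hop graph traversals.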