Zep, a long-term memory store for LLM applications, lets developers bring relevant documents and chat history into LLM app prompts through document and chat-history storage, embedding, and enrichment. The article highlights Zep’s Document Vector Store and the ZepVectorStore integration for LlamaIndex, walking through how to create and manage document collections with hybrid semantic search. Zep integrates with frameworks such as LangChain and LlamaIndex and can be installed via Docker or Kubernetes. Collections in Zep store document text, embeddings, and metadata, enabling semantic searches that can be filtered by metadata using JSONPath queries. The article then shows how to set up a ZepVectorStore, build indices using standard LlamaIndex patterns, and run hybrid searches with metadata filters, illustrated with a practical example of querying astronomical data. Together, Zep and LlamaIndex provide a seamless way to manage and query large document sets, extending what LLM applications can do.
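The metadata-filtered hybrid search mentioned above hinges on Zep's JSONPath filter syntax: a `where` clause containing a JSONPath expression is evaluated against each document's metadata. A minimal sketch of building such a filter is below; it constructs the filter dictionary only (no server calls), the helper name and the astronomy field values are illustrative, and the exact clause shape should be checked against your installed Zep version.

```python
def make_metadata_filter(field: str, value: str) -> dict:
    """Build a Zep-style JSONPath metadata filter (illustrative helper).

    The `where` clause holds a JSONPath expression that Zep evaluates
    against each document's metadata during hybrid search; documents
    whose `field` equals `value` pass the filter.
    """
    return {"where": {"jsonpath": f'$[*] ? (@.{field} == "{value}")'}}


# Hypothetical filter for the article's astronomy scenario:
# restrict results to documents tagged as planets.
flt = make_metadata_filter("category", "planet")
```

A filter built this way would typically be passed alongside a query when searching a collection, so the semantic match and the metadata constraint are applied together.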