Long-term memory management is crucial for building effective AI agents: it means bringing information that was not produced in the current interaction into the model's prompt. This article shows how Gel's trigger system can streamline long-term memory management by handling it as background work, illustrated with a custom multi-agent chatbot application. The application employs three specialized agents: the Talker agent, which converses with users; the Extractor agent, which pulls user information and feedback out of the chat history; and the Summarizer agent, which compresses the chat history and generates conversation titles.

The stack combines FastAPI, Pydantic AI, and Streamlit, with Gel serving as the database and dispatching the background tasks via triggers. This keeps information extraction, chat-history summarization, and chat-title generation off the critical path, so they add no latency to the user-agent exchange and result in a more responsive, personalized experience.
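As a rough illustration of the pattern (not the article's exact schema), a Gel type can declare a trigger that fires after an insert and records a pending background task for the Extractor and Summarizer agents to pick up. The type and property names below (`Message`, `Chat`, `MemoryTask`) are hypothetical placeholders; only the `trigger ... after insert for each do (...)` construct and `__new__` are standard Gel schema features.

```sdl
module default {
  type Chat {
    title: str;
    summary: str;
  }

  type Message {
    required role: str;
    required content: str;
    required chat: Chat;
    created_at: datetime {
      default := datetime_current();
    }

    # Hypothetical trigger: every stored message enqueues a task that a
    # background worker (Extractor/Summarizer) can later process,
    # keeping the user-facing Talker agent free of extra latency.
    trigger enqueue_memory_task after insert for each do (
      insert MemoryTask {
        message := __new__,
        status := 'pending',
      }
    );
  }

  # A simple work-queue type the background agents poll or subscribe to.
  type MemoryTask {
    required message: Message;
    required status: str;
  }
}
```

The key design point this sketch captures is that the user-facing request only writes the message; the trigger, running inside the database, takes care of scheduling the memory-management work so the chat response is never blocked on extraction or summarization.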