Netflix's Content Finance Engineering team aims to build a robust and scalable financial platform that provides critical insights for planning, budgeting, and tracking content spending. The team follows a microservices-driven approach, modeling each financial application as a separate service. However, this creates potential issues with data consistency and availability, particularly when complex event exchanges rely on synchronous, request-based interactions.

To address these challenges, Netflix adopted an event-driven architecture with Apache Kafka as the de facto standard for messaging and stream processing. The platform enables asynchronous communication, promotes decoupling, and treats traceability as a first-class citizen. With Kafka's high durability and linear scalability, Netflix can process large volumes of data from a variety of sources, including databases and external services. Using Confluent Schema Registry and Apache Avro, the team defines schemas for its output streams to ensure backward compatibility and versioning. Spring Boot services backed by EVCache implement idempotent behavior, allowing the distributed system to achieve effectively exactly-once delivery.

The platform also provides a real-time view of service levels across the infrastructure: key metrics are monitored and dimensional time series data is visualized with Atlas. By adopting this event-driven architecture, Netflix can efficiently manage its financial applications, surface critical insights into content spending, and drive business decisions.
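As a concrete illustration of the schema-first approach described above, the sketch below shows how a producer might publish an Avro-encoded event through Confluent Schema Registry. This is a minimal sketch: the topic name, schema fields, and endpoint URLs are illustrative assumptions, not Netflix's actual definitions.

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ContentSpendProducer {

    // Hypothetical Avro schema for a content-spend event; field names are illustrative.
    private static final String SCHEMA_JSON =
        "{\"type\":\"record\",\"name\":\"ContentSpendEvent\",\"namespace\":\"com.example.finance\","
      + "\"fields\":[{\"name\":\"showId\",\"type\":\"string\"},"
      + "{\"name\":\"amount\",\"type\":\"double\"},"
      + "{\"name\":\"currency\",\"type\":\"string\",\"default\":\"USD\"}]}";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                  "org.apache.kafka.common.serialization.StringSerializer");
        // Confluent's Avro serializer registers and validates the schema against Schema Registry,
        // which is where backward-compatibility rules are enforced.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                  "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");
        // Idempotent producer avoids broker-side duplicates on retries.
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");

        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);
        GenericRecord event = new GenericData.Record(schema);
        event.put("showId", "show-123");
        event.put("amount", 250000.0);
        event.put("currency", "USD");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("content-spend-events", "show-123", event));
        }
    }
}
```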
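The idempotency pattern mentioned above can be sketched as a Spring Kafka listener that deduplicates on an event id before processing. The in-memory map here is only a stand-in for a distributed cache such as EVCache, and the topic, consumer group, and payload shape are hypothetical.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

/**
 * Sketch of an idempotent consumer: each event carries a unique id that is checked
 * against a shared cache before processing, so redelivered messages are skipped.
 */
@Component
public class SpendEventListener {

    // Stand-in for a distributed cache (e.g. EVCache); keys are ids already processed.
    private final Map<String, Boolean> processedIds = new ConcurrentHashMap<>();

    // Payload simplified to just the event id for illustration.
    @KafkaListener(topics = "content-spend-events", groupId = "finance-reporting")
    public void onEvent(String eventId) {
        // putIfAbsent returns null only the first time an id is seen, giving
        // effectively exactly-once processing on top of Kafka's at-least-once delivery.
        if (processedIds.putIfAbsent(eventId, Boolean.TRUE) != null) {
            return; // duplicate delivery, already handled
        }
        process(eventId);
    }

    private void process(String eventId) {
        // Business logic (e.g. updating a budget ledger) would go here.
    }
}
```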
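On the monitoring side, dimensional time series for Atlas are typically recorded through Netflix's Spectator client. The sketch below shows how a service-level metric with queryable dimensions might be published; the metric names and tags are assumptions for illustration, not the metrics the team actually tracks.

```java
import java.util.concurrent.TimeUnit;

import com.netflix.spectator.api.DefaultRegistry;
import com.netflix.spectator.api.Registry;
import com.netflix.spectator.api.Timer;

public class ServiceLevelMetrics {

    private final Registry registry;
    private final Timer processingTime;

    public ServiceLevelMetrics(Registry registry) {
        this.registry = registry;
        // Tags become queryable dimensions of the time series in Atlas.
        this.processingTime = registry.timer("finance.event.processingTime",
                                             "topic", "content-spend-events");
    }

    public void recordEvent(long durationNanos, boolean success) {
        processingTime.record(durationNanos, TimeUnit.NANOSECONDS);
        registry.counter("finance.event.processed",
                         "outcome", success ? "success" : "failure").increment();
    }

    public static void main(String[] args) {
        // DefaultRegistry is the in-memory registry; in production a backend that
        // ships measurements to Atlas would be injected instead.
        ServiceLevelMetrics metrics = new ServiceLevelMetrics(new DefaultRegistry());
        metrics.recordEvent(TimeUnit.MILLISECONDS.toNanos(12), true);
    }
}
```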