Change Data Capture (CDC) is a process crucial for synchronizing operational data stores (ODS) with data warehouses (DWH), enabling strategic decisions based on the most current data. Popularized by Bill Inmon, CDC involves identifying and tracking record-level changes in data, and implementations vary in frequency and method, ranging from inter-day and intra-day batches to real-time updates. This blog post, part one of a two-part series, explores the necessity and role of CDC in modern data stacks, highlighting its evolution from batch-mode data warehousing to real-time synchronization across various databases and non-DWH data stores using tools like Kafka and FiveTran. Apache Airflow is presented as a key tool for managing data pipelines that use CDC, letting users author, schedule, and monitor data workflows in Python and providing a flexible, scalable foundation for a range of business use cases. Examples illustrate how CDC can propagate changes from an ODS to a DWH, using Airflow to implement both full-sync and incremental-sync strategies so that target databases remain updated and aligned with their source systems.
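
To make the incremental-sync idea concrete, here is a minimal sketch of what such a pipeline might look like with Airflow's TaskFlow API (Airflow 2.x). The `orders` table, its `updated_at` watermark column, and the hourly schedule are illustrative assumptions rather than details from the post; a real implementation would execute the query through a database hook and merge the results into the warehouse.

```python
# Minimal sketch of an incremental-sync DAG under assumed names:
# a hypothetical "orders" table in the ODS with an "updated_at"
# column that serves as the CDC high-watermark.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False)
def incremental_sync_orders():
    """Propagate only the rows changed during each scheduled interval."""

    @task
    def extract_changes(data_interval_start=None, data_interval_end=None):
        # Select only rows whose updated_at timestamp falls inside the
        # current data interval -- the incremental, CDC-style alternative
        # to re-copying the whole table on every run.
        query = (
            "SELECT * FROM orders "
            f"WHERE updated_at >= '{data_interval_start}' "
            f"AND updated_at < '{data_interval_end}'"
        )
        print(f"Would run against the ODS: {query}")
        return []  # placeholder for the changed rows

    @task
    def load_changes(rows):
        # Upsert (merge) the changed rows into the DWH target so that
        # inserts and updates from the source are both reflected downstream.
        print(f"Would merge {len(rows)} changed rows into the warehouse")

    load_changes(extract_changes())


incremental_sync_orders()
```

A full-sync strategy would instead drop the time filter and reload the entire table on each run, which is simpler to reason about but grows increasingly expensive as the source table does.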