The BBC's Datalab team, led by Principal Data Engineer Tatiana Al-Chueyr Pereira Martins, adopted Apache Airflow to address the challenges of delivering personalized experiences, work that spans machine learning, software engineering, data engineering, and DevOps. Before adopting Airflow, the team lacked standardized ETL processes and orchestration tooling, which made building machine learning pipelines difficult. Alternatives they considered, such as PubSub, Luigi, and TensorFlow Extended, did not meet their needs.

Airflow's Python compatibility and flexibility allowed the team to create workflow templates and move to a configuration-driven approach, significantly accelerating their machine learning model development. Despite initial resistance driven by tight deadlines, the team's success with Airflow inspired other BBC teams to adopt it. The tool enabled a centralized platform for visualizing workflows and identifying issues, which in turn facilitated code reuse and system monitoring. Airflow's open-source nature offers media companies like the BBC the flexibility and resilience to adapt to technological change without overhauling their entire data stack.
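The configuration-driven approach described above can be sketched in plain Python. The example below is a hypothetical illustration of the general pattern, not the BBC's actual code: a shared pipeline template is expanded from small per-pipeline configs, so adding a new model pipeline means adding a config entry rather than writing new orchestration code. In real Airflow usage, `build_pipeline` would instantiate a `DAG` and operators instead of returning a plain dictionary; all names here (`PipelineConfig`, the step list, the example pipelines) are assumptions for illustration.

```python
from dataclasses import dataclass


@dataclass
class PipelineConfig:
    """Per-pipeline settings; everything else comes from the template."""
    name: str
    source: str
    model: str
    schedule: str = "@daily"


# The shared workflow template: every pipeline runs the same ordered steps.
STEPS = ["extract", "train", "evaluate", "publish"]


def build_pipeline(cfg: PipelineConfig) -> dict:
    """Expand one config into a concrete pipeline description:
    a mapping from each task id to its upstream dependency
    (None marks the root task)."""
    tasks = [f"{cfg.name}.{step}" for step in STEPS]
    deps = {t: (tasks[i - 1] if i > 0 else None) for i, t in enumerate(tasks)}
    return {"pipeline": cfg.name, "schedule": cfg.schedule, "tasks": deps}


# Illustrative configs only; the actual BBC pipelines are not public.
configs = [
    PipelineConfig("recommendations", source="activity-logs", model="ranker"),
    PipelineConfig("topic-tagging", source="articles", model="classifier"),
]
pipelines = [build_pipeline(c) for c in configs]
```

The design point is that the template (the step list and wiring logic) lives in one place, so every pipeline inherits fixes and improvements automatically, which is the kind of reuse the team credits for faster model development.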