Apache Airflow has evolved significantly since its creation at Airbnb in 2014, growing into a versatile orchestrator for complex data workflows rather than just traditional ETL processes. Despite persistent misconceptions, Airflow supports scheduling options beyond time-based triggers, including event-driven and asset-based scheduling, which address modern data orchestration needs.

While Airflow is not designed for stream processing, it complements streaming tools such as Kafka by managing the lifecycle of streaming jobs. Criticisms that it is unsuitable for machine learning or AI workflows overlook its growing adoption for exactly those purposes, driven by features like dynamic task mapping and a dedicated AI SDK.

Far from being legacy technology, Airflow remains under active development, with numerous major releases and an expanding contributor base reflecting its adaptability and growing use across diverse industries and applications.
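To make the asset-based scheduling and dynamic task mapping mentioned above concrete, a minimal DAG sketch might look like the following. This is an illustrative fragment, not a production pipeline: the dataset URI, DAG name, and task bodies are hypothetical, and it assumes Airflow 2.4+ (where `Dataset`-based scheduling was introduced; in Airflow 3 the concept is renamed to `Asset`) with dynamic task mapping via `.expand()` (Airflow 2.3+).

```python
import pendulum
from airflow.decorators import dag, task
from airflow.datasets import Dataset

# Hypothetical upstream asset: any producer task that lists this
# Dataset in its `outlets` will trigger this DAG on completion.
orders = Dataset("s3://example-bucket/orders/")

@dag(
    schedule=[orders],  # asset-based: run when `orders` is updated, not on a cron
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
)
def process_orders():
    @task
    def list_regions() -> list[str]:
        # Placeholder: in practice this might query a catalog or config store.
        return ["us", "eu", "apac"]

    @task
    def transform(region: str):
        # Placeholder per-region transformation.
        print(f"processing {region}")

    # Dynamic task mapping: one `transform` task instance per region,
    # fanned out at runtime from the output of `list_regions`.
    transform.expand(region=list_regions())

process_orders()
```

The key points are the `schedule=[orders]` argument, which replaces a time-based schedule with a data dependency, and `transform.expand(...)`, which lets the number of parallel task instances vary per run instead of being fixed at DAG-authoring time.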