Apache Airflow has officially joined the ranks of Top-Level Projects at the Apache Software Foundation (ASF). Apache Airflow is a workflow automation and scheduling system for managing Big Data pipelines.
By graduating from incubation to become a Top-Level Project, Apache Airflow has proven that it and its community have been well-governed under the ASF’s processes and principles, the ASF explained.
According to the ASF, Apache Airflow is currently being used by over 200 organizations, including Adobe, Etsy, Google, and Twitter. It was initially created by Airbnb in 2014, and then submitted to the Apache Incubator in March 2016.
The project is able to run tasks written in various languages, which allows it to integrate well with common architectures and projects, such as AWS S3, Docker, and Apache Hadoop HDFS, the ASF explained.
“Since its inception, Apache Airflow has quickly become the de-facto standard for workflow orchestration,” said Bolke de Bruin, vice president of Apache Airflow. “Airflow has gained adoption among developers and data scientists alike thanks to its focus on configuration-as-code. That has gained us a community during incubation at the ASF that not only uses Apache Airflow but also contributes back. This reflects Airflow’s ease of use, scalability, and power of our diverse community; that it is embraced by enterprises and start-ups alike, allows us to now graduate to a Top-Level Project.”
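The configuration-as-code approach de Bruin mentions means that pipelines are defined as ordinary Python files rather than in a GUI or markup. A minimal sketch of such a definition, using the Airflow 1.x API current at the time of graduation (the DAG and task names here are invented for illustration), might look like this:

```python
# A hypothetical Airflow DAG file: the pipeline is plain Python code.
# Assumes apache-airflow 1.x is installed; names are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG(
    dag_id="example_pipeline",
    start_date=datetime(2019, 1, 1),
    schedule_interval="@daily",  # scheduler triggers one run per day
)

extract = BashOperator(task_id="extract", bash_command="echo extracting", dag=dag)
load = BashOperator(task_id="load", bash_command="echo loading", dag=dag)

extract >> load  # "load" runs only after "extract" succeeds
```

Because the definition is code, it can be versioned, reviewed, and tested like any other software artifact, which is a large part of Airflow's appeal to developers and data scientists.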
Features include smart scheduling, database and dependency management, error handling, logging, and resource management.
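The dependency management feature boils down to treating the pipeline as a directed acyclic graph and running tasks in an order that respects their dependencies. A toy illustration of that idea in plain Python (this uses the standard library's topological sorter and invented task names; it is not Airflow's implementation):

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Toy pipeline graph: each task maps to the set of tasks it depends on.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields a run order in which every task's
# dependencies appear before the task itself.
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```

A scheduler like Airflow's does the same kind of ordering continuously, also handling retries, logging, and resource limits around each task.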