Based on Python
Python is used to describe ETL/ELT processes, so anyone with Python experience will find Airflow easy to pick up.
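As a minimal sketch, a pipeline is a Python file that declares a DAG and wires tasks together (the DAG id, schedule, and callables below are illustrative, and running the file requires an Airflow installation):

```
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extracting data")

def load():
    print("loading data")

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # the >> operator declares that load runs only after extract succeeds
    extract_task >> load_task
```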
A small but full-fledged toolkit
Great for creating and managing data pipelines. You can work with Airflow through the CLI, the REST API, or a web interface built on the Flask Python framework.
Airflow supports many databases (MySQL, PostgreSQL, DynamoDB, Hive), big data storage systems (HDFS, Amazon S3), and cloud platforms (Google Cloud Platform, Amazon Web Services, Microsoft Azure).
An extensible REST API
Makes it relatively easy to integrate Airflow into an existing enterprise IT landscape and to customize data pipelines flexibly.
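For example, Airflow's stable REST API exposes an endpoint that lists registered DAGs. The sketch below builds such a request with the standard library; the host and credentials are hypothetical placeholders:

```python
import base64
import urllib.request

def build_list_dags_request(base_url, user, password):
    """Build an authenticated GET request for Airflow's /api/v1/dags endpoint."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        f"{base_url}/api/v1/dags",
        headers={"Authorization": f"Basic {token}"},
    )

# Hypothetical local instance and credentials; sending this request to a
# running webserver returns a JSON listing of DAGs.
req = build_list_dags_request("http://localhost:8080", "admin", "admin")
```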
Monitoring and alerting
Integration with StatsD and Fluentd is supported for collecting and shipping metrics and logs.
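As a sketch, shipping metrics to StatsD is enabled in `airflow.cfg` (the host and port values here are illustrative):

```ini
[metrics]
statsd_on = True
statsd_host = localhost
statsd_port = 8125
statsd_prefix = airflow
```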
Airflow provides five roles with different access levels: Admin, Public, Viewer, Op, and User. Integration with Active Directory and fine-grained access control via RBAC are also possible.
It’s possible to cover pipelines, and the individual tasks within them, with basic unit tests.
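One common approach is to keep task logic in plain Python functions and unit-test those directly, independent of the scheduler. The transform function and its rules below are hypothetical:

```python
def transform(rows):
    """Keep rows that have a name and normalise the name to lowercase."""
    return [{"name": r["name"].strip().lower()} for r in rows if r.get("name")]

def test_transform_filters_and_normalises():
    rows = [{"name": "  Alice "}, {"name": ""}, {}]
    assert transform(rows) == [{"name": "alice"}]

test_transform_filters_and_normalises()
```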
Airflow is scalable thanks to its modular architecture and a message queue, and can handle an unlimited number of DAGs.
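Before tasks are handed to workers through the queue, the scheduler resolves the order implied by task dependencies. The idea can be sketched with the standard library's `graphlib` (task names are illustrative, not Airflow API):

```python
from graphlib import TopologicalSorter

# Toy dependency graph: each task maps to the set of tasks that must
# finish before it can start.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields tasks in an order that respects every dependency,
# which is essentially what a scheduler computes for each DAG run.
order = list(TopologicalSorter(deps).static_order())
```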
Airflow is actively maintained by the community and has thorough documentation.
What is Luigi?
Luigi is a Python framework for building complex sequences of dependent tasks. A large part of the framework is aimed at transforming data from various sources (MySQL, MongoDB, Redis) and at using various tools (from launching a process to running jobs of various types on a Hadoop cluster).
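As a minimal sketch of Luigi's style, each task declares its dependencies via `requires()` and its result via `output()` (the file paths and logic here are hypothetical, and running this requires the `luigi` package):

```
import luigi

class Extract(luigi.Task):
    def output(self):
        return luigi.LocalTarget("data/raw.txt")

    def run(self):
        with self.output().open("w") as f:
            f.write("raw data")

class Transform(luigi.Task):
    def requires(self):
        # Luigi builds the dependency graph from requires()
        return Extract()

    def output(self):
        return luigi.LocalTarget("data/clean.txt")

    def run(self):
        with self.input().open() as src, self.output().open("w") as dst:
            dst.write(src.read().upper())
```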
What is Dagster?
Dagster is an orchestrator designed for developing and maintaining data assets such as tables, data sets, machine learning models, and reports.
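As a minimal sketch of the asset-centric style, Dagster lets you declare assets as decorated functions, wiring dependencies from parameter names (the asset names and logic here are hypothetical, and running this requires the `dagster` package):

```
from dagster import asset

@asset
def raw_orders():
    """An upstream data asset, e.g. rows pulled from a source system."""
    return [{"id": 1, "amount": 10}, {"id": 2, "amount": 25}]

@asset
def large_orders(raw_orders):
    """A downstream asset; Dagster infers the dependency from the parameter name."""
    return [o for o in raw_orders if o["amount"] > 20]
```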
Meets the tasks: 7
Tool functionality: 9
Apache Airflow is an advanced workflow manager and an indispensable tool in the arsenal of a modern data engineer. If you look at open data engineer vacancies, you will often find Airflow experience listed among the requirements.
Meets the tasks: 5
Tool functionality: 8