apache/airflow

Apache Airflow - A platform to programmatically author, schedule, and monitor workflows

Score: 85 / 100 · Verified

Airflow helps you automate and manage sequences of tasks, known as workflows, that run regularly. You define the steps of your workflow as code, and Airflow ensures they run in the right order and on schedule, and lets you track their progress. It is aimed at data engineers, DevOps specialists, or anyone who needs to reliably orchestrate complex data pipelines or automated processes.
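The core idea behind "run in the right order" can be sketched with the standard library alone: tasks declare their upstream dependencies, and a topological sort yields a valid execution order. This is an illustration of the guarantee, not Airflow's actual API (a real workflow would be defined as a DAG of Airflow operators).

```python
from graphlib import TopologicalSorter

# A tiny ETL-style workflow: each task maps to the set of tasks it
# depends on. Airflow expresses the same idea as a DAG of operators;
# this stdlib sketch only illustrates the ordering guarantee.
workflow = {
    "extract": set(),          # no upstream dependencies
    "transform": {"extract"},  # runs after extract
    "load": {"transform"},     # runs after transform
}

# static_order() yields the tasks in a dependency-respecting order.
order = list(TopologicalSorter(workflow).static_order())
print(order)  # extract, then transform, then load
```

Airflow adds scheduling, retries, and monitoring on top of this ordering, which is what the hand-rolled version above cannot give you.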

44,620 stars. Used by 3 other packages. Actively maintained with 711 commits in the last 30 days. Available on PyPI.

Use this if you need to schedule, monitor, and manage workflows that are mostly static and run on a recurring basis, such as daily data imports or report generation.

Not ideal if your workflows change constantly or require very dynamic, real-time adjustments between steps.

data-pipeline-orchestration workflow-automation ETL-management job-scheduling data-operations
Maintenance: 22 / 25
Adoption: 13 / 25
Maturity: 25 / 25
Community: 25 / 25


Stars: 44,620
Forks: 16,685
Language: Python
License: Apache-2.0
Last pushed: Mar 13, 2026
Commits (30d): 711
Dependencies: 2
Reverse dependents: 3

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mlops/apache/airflow"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
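The same endpoint can be called from Python with the standard library. The `quality_url` and `fetch_quality` helpers below are introduced here for illustration, and the response's JSON field names are not documented on this page, so the fetch itself is only sketched.

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-API URL for a repository (illustrative helper)."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (field names are an assumption
    here; inspect the real response before relying on any key)."""
    with urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

# Reconstructs the curl URL shown above without making a request.
url = quality_url("mlops", "apache", "airflow")
print(url)
```

Calling `fetch_quality("mlops", "apache", "airflow")` would perform the same request as the curl command, subject to the 100 requests/day limit.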