airflow vs. argo-workflows

These projects are competitors that take different approaches to workflow orchestration: Airflow is a Python-based DAG scheduler that can run on many kinds of infrastructure, while Argo Workflows is a Kubernetes-native engine that runs containerized tasks directly on a K8s cluster.

                 airflow           argo-workflows
Score            85 (Verified)     73 (Verified)
Maintenance      22/25             22/25
Adoption         13/25             10/25
Maturity         25/25             16/25
Community        25/25             25/25
Stars            44,620            16,517
Forks            16,685            3,494
Downloads        —                 —
Commits (30d)    711               74
Language         Python            Go
License          Apache-2.0        Apache-2.0

No risk flags for either project; no package registry or dependents data is available.

About airflow

apache/airflow

Apache Airflow - A platform to programmatically author, schedule, and monitor workflows

Airflow helps you automate and manage sequences of tasks, known as workflows, that run on a regular schedule. You define the steps of your workflow as code, and Airflow ensures they run in the right order and on time, and lets you track their progress. It is aimed at data engineers, DevOps specialists, and anyone who needs to reliably orchestrate complex data pipelines or automated processes.

Tags: data-pipeline-orchestration, workflow-automation, ETL-management, job-scheduling, data-operations

About argo-workflows

argoproj/argo-workflows

Workflow Engine for Kubernetes

Argo Workflows helps you automate and manage complex, multi-step tasks like machine learning pipelines, data processing, or infrastructure setup within a Kubernetes environment. You provide the individual steps as containers, and it orchestrates their execution, allowing you to visualize and track progress. This tool is ideal for platform engineers, MLOps specialists, or data engineers who need to run parallel, compute-intensive jobs reliably.
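"Steps as containers" means an Argo workflow is a Kubernetes manifest: each step names a container image, and the engine schedules the pods in dependency order. The sketch below is an illustrative two-step manifest (step names, image, and commands are examples, not taken from the project above).

```yaml
# Illustrative Argo Workflow: "load" runs only after "extract" succeeds.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: etl-
spec:
  entrypoint: main
  templates:
    - name: main
      steps:
        - - name: extract          # first step group
            template: run-step
            arguments:
              parameters: [{name: cmd, value: "echo extract"}]
        - - name: load             # second group: runs after extract
            template: run-step
            arguments:
              parameters: [{name: cmd, value: "echo load"}]
    - name: run-step
      inputs:
        parameters:
          - name: cmd
      container:
        image: alpine:3.20
        command: [sh, -c]
        args: ["{{inputs.parameters.cmd}}"]
```

Submitted to a cluster with `argo submit --watch`, each step runs as its own pod, so parallel step groups scale out across the cluster's nodes.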

Tags: MLOps, data-engineering, workflow-orchestration, CI/CD, infrastructure-automation

Scores are updated daily from GitHub, PyPI, and npm data.