airflow and argo-workflows
These are competitors offering different approaches to workflow orchestration: Airflow is a Python-based DAG scheduler that can run on a variety of infrastructure, while Argo Workflows is a Kubernetes-native engine that runs each task as a container directly on a K8s cluster.
About airflow
apache/airflow
Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
Airflow helps you automate and manage sequences of tasks, known as workflows, that run regularly. You provide the steps of your workflow as code, and Airflow ensures they run in the right order, on schedule, and lets you track their progress. This is for data engineers, DevOps specialists, or anyone who needs to reliably orchestrate complex data pipelines or automated processes.
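To give a sense of what "workflow as code" means in practice, a minimal Airflow DAG file might look like the sketch below. The DAG id, task names, and shell commands are illustrative, not taken from any real pipeline; the `>>` operator is how Airflow expresses task ordering.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Illustrative two-step pipeline; dag_id, task ids, and commands are made up.
with DAG(
    dag_id="daily_example",           # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # run once per day
    catchup=False,                    # don't backfill missed runs
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extracting")
    load = BashOperator(task_id="load", bash_command="echo loading")

    # Declare the dependency: Airflow runs "extract" before "load".
    extract >> load
```

Airflow discovers files like this in its DAGs folder, then schedules and tracks each run; the code only declares the graph, it does not execute the tasks itself.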
About argo-workflows
argoproj/argo-workflows
Workflow Engine for Kubernetes
Argo Workflows helps you automate and manage complex, multi-step tasks like machine learning pipelines, data processing, or infrastructure setup within a Kubernetes environment. You provide the individual steps as containers, and it orchestrates their execution, allowing you to visualize and track progress. This tool is ideal for platform engineers, MLOps specialists, or data engineers who need to run parallel, compute-intensive jobs reliably.
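By contrast, "steps as containers" in Argo means declaring the workflow as a Kubernetes manifest, where each step names a container image to run. A minimal sketch follows; the names and image are illustrative, in the style of Argo's hello-world example:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-        # Argo appends a random suffix per run
spec:
  entrypoint: main            # template to start from
  templates:
    - name: main
      steps:                  # each inner list runs its entries in parallel
        - - name: say-hello
            template: whalesay
    - name: whalesay
      container:
        image: docker/whalesay   # illustrative image
        command: [cowsay]
        args: ["hello from Argo"]
```

Submitting this manifest (for example with `kubectl create` or the `argo` CLI) makes the Argo controller create a pod for each step and track the workflow's progress as a Kubernetes resource.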
Scores updated daily from GitHub, PyPI, and npm data.