ORNL/flowcept

Runtime provenance for AI and scientific workflows—capture, enrich, and query workflow data via observability adapters and code annotation across edge, cloud, and HPC.

Score: 47 / 100 (Emerging)

This tool helps scientists, machine learning engineers, and data analysts track and understand the execution of their complex AI and scientific workflows. It captures detailed records of what data went into each step, what came out, and how long each part took, even across computing environments such as local machines, the cloud, and supercomputers. The output is a clear, human-readable report summarizing the workflow's structure and performance, making results easier to debug, reproduce, and audit.

Use this if you need to automatically monitor and audit the steps and data flow within your scientific simulations, machine learning training, or data processing pipelines, especially across distributed systems.

Not ideal if you are only running simple, one-off scripts where tracking every input and output is unnecessary, or if you primarily need to monitor infrastructure metrics rather than workflow execution details.

scientific-workflow-management machine-learning-operations data-pipeline-auditing distributed-computing-monitoring research-reproducibility
No package · No dependents
Maintenance 10 / 25
Adoption 5 / 25
Maturity 16 / 25
Community 16 / 25


Stars: 14
Forks: 7
Language: Python
License: MIT
Last pushed: Feb 21, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mlops/ORNL/flowcept"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
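The same endpoint can be called from Python with only the standard library. This is a minimal sketch: the URL pattern (`/api/v1/quality/<category>/<owner>/<repo>`) is taken from the curl command above, but the shape of the JSON response is an assumption, not documented here.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a repository."""
    return f"{API_BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch the quality report; assumes the endpoint returns a JSON object."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


# Same endpoint as the curl command above:
url = quality_url("mlops", "ORNL", "flowcept")
print(url)  # https://pt-edge.onrender.com/api/v1/quality/mlops/ORNL/flowcept
```

Within the free tier (100 requests/day without a key), no authentication header is needed; how a key is passed for the higher limit is not specified on this page.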