SebChw/Actually-Robust-Training

Actually Robust Training is a tool inspired by Andrej Karpathy's "A Recipe for Training Neural Networks". It lets you decompose your deep learning pipeline into modular, insightful "Steps", and it includes many features for testing and debugging neural networks.

Score: 24 / 100 (Experimental)

When training deep neural networks with PyTorch, this tool helps you follow best practices and ensures your model training process is robust and explainable. It breaks down your deep learning pipeline into a series of modular steps, allowing you to feed in your data and model to get a well-debugged and reproducible training outcome. This is for machine learning engineers, data scientists, and deep learning practitioners who build and train neural networks.
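This page does not show the project's actual API, so as a rough illustration of the step-based idea only, here is a hypothetical sketch of a pipeline built from named, modular steps. All class and method names below (`Pipeline`, `add`, `run`) are invented for illustration and do not come from the library itself.

```python
# Hypothetical sketch of a step-based training pipeline.
# None of these names come from Actually-Robust-Training itself.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Pipeline:
    """Holds an ordered list of (name, function) steps."""
    steps: list = field(default_factory=list)

    def add(self, name: str, fn: Callable[[dict], dict]) -> "Pipeline":
        """Register a step; returns self so calls can be chained."""
        self.steps.append((name, fn))
        return self

    def run(self, state: dict) -> dict:
        """Run each step in order, threading a shared state dict through."""
        for name, fn in self.steps:
            state = fn(state)
            print(f"step '{name}' done")
        return state

# Each step receives the pipeline state and returns an updated copy,
# so individual steps (data checks, overfitting one batch, ...) stay
# independently testable and debuggable.
pipeline = (
    Pipeline()
    .add("check_data", lambda s: {**s, "data_ok": True})
    .add("overfit_one_batch", lambda s: {**s, "loss": 0.01})
)
result = pipeline.run({})
```

The real library's steps wrap PyTorch models and data loaders rather than plain dicts; the sketch only shows the decomposition pattern the description refers to.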

No commits in the last 6 months.

Use this if you are training deep learning models with PyTorch and want to ensure correctness, reproducibility, and robust debugging throughout your experiment lifecycle.

Not ideal if you are not working with PyTorch for deep learning or are looking for a no-code solution for model training.

deep-learning neural-network-training model-debugging machine-learning-engineering pytorch-workflows
Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 8 / 25
Maturity: 16 / 25
Community: 0 / 25


Stars: 43
Forks:
Language: Python
License: MIT
Last pushed: Apr 13, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/SebChw/Actually-Robust-Training"

Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
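The same data can be fetched from Python instead of curl. Only the endpoint URL above comes from this page; the shape of the JSON response is not documented here, so the sketch below decodes it generically rather than assuming particular fields.

```python
# Minimal sketch of calling the quality API with the standard library.
# The URL structure is taken from the curl example on this page; the
# response fields are not documented here and are treated as opaque JSON.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, repo: str) -> str:
    """Build the API URL for a repo's quality report."""
    return f"{BASE}/{category}/{repo}"

def fetch_quality(category: str, repo: str) -> dict:
    """Fetch and decode the JSON quality report (makes a network call)."""
    with urllib.request.urlopen(quality_url(category, repo)) as resp:
        return json.load(resp)

# Build the same URL the curl example uses (no network call here).
print(quality_url("ml-frameworks", "SebChw/Actually-Robust-Training"))
```

Calling `fetch_quality("ml-frameworks", "SebChw/Actually-Robust-Training")` would perform the actual request, subject to the 100 requests/day limit noted above.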