saturncloud/dask-pytorch-ddp
dask-pytorch-ddp is a Python package that makes it easy to train PyTorch models on Dask clusters using distributed data parallel.
This tool helps machine learning engineers and data scientists train PyTorch models more efficiently on large datasets, particularly for computer vision tasks. It takes an existing PyTorch model and training script, along with large image datasets stored in cloud storage such as S3, and runs training across multiple machines. The result is the same trained PyTorch model, produced much faster than on a single machine.
No commits in the last 6 months. Available on PyPI.
Use this if you need to train PyTorch models on very large datasets that don't fit on a single machine, or if you want to speed up training by utilizing multiple GPUs or machines.
Not ideal if your dataset is small enough to train efficiently on a single GPU or CPU, or if you are not already using PyTorch for your deep learning tasks.
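To make the workflow above concrete, here is a minimal sketch of how training is dispatched with this package. It is based on the package's documented `dispatch.run` entry point; the scheduler address and toy model are placeholders, and the shard helper is an illustrative stand-in for how `torch.utils.data.DistributedSampler` splits data across workers.

```python
# Hedged sketch of a dask-pytorch-ddp workflow. The scheduler address,
# model, and shard helper are illustrative placeholders, not a drop-in
# script.

def shard_indices(n_samples, n_workers, rank):
    """Interleaved shard one worker would see, mimicking torch's
    DistributedSampler without shuffling or padding (illustrative)."""
    return list(range(rank, n_samples, n_workers))

def train():
    """Runs once on every dask worker; dispatch.run sets up the
    torch.distributed process group before calling it."""
    import torch  # imported lazily so this sketch stays importable
    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP

    rank = dist.get_rank()
    model = DDP(torch.nn.Linear(10, 2))  # toy model standing in for a real one
    # A real script would build a DataLoader over the S3 dataset here
    # and run the usual training loop against `model`.
    return rank

def main():
    # Requires dask, torch, and a running Dask cluster; not executed here.
    from dask.distributed import Client
    from dask_pytorch_ddp import dispatch

    client = Client("tcp://scheduler-address:8786")  # placeholder address
    futures = dispatch.run(client, train)  # one future per dask worker
    client.gather(futures)
```

Each worker executes the same `train` function and DDP keeps the model replicas in sync, which is why the per-worker code looks almost identical to a single-machine training script.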
Stars: 59
Forks: 10
Language: Python
License: BSD-3-Clause
Category: ml-frameworks
Last pushed: Apr 05, 2021
Commits (30d): 0
Dependencies: 4
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/saturncloud/dask-pytorch-ddp"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related frameworks
pclubiitk/model-zoo
Implementations of various Deep Learning models in PyTorch and TensorFlow.
neuralmagic/deepsparse
Sparsity-aware deep learning inference runtime for CPUs
theairbend3r/how-to-train-your-neural-net
Deep learning research implemented on notebooks using PyTorch.
neuralmagic/sparsezoo
Neural network model repository for highly sparse and sparse-quantized models with matching...
ekinakyurek/KnetLayers.jl
Useful Layers for Knet