NERSC/dl4sci25-dl-at-scale
Deep learning for science school material 2025
This project provides practical, hands-on examples for scientists and researchers to train deep learning models efficiently on powerful supercomputing systems like NERSC's Perlmutter. It demonstrates how to use atmospheric data to train advanced models for tasks like weather forecasting, showcasing techniques to optimize training speed and handle large datasets. The material is designed for scientific domain experts looking to scale their deep learning applications.
No commits in the last 6 months.
Use this if you are a researcher or scientist who needs to train complex deep learning models using large scientific datasets and high-performance computing resources.
Not ideal if you are looking for a simple, local deep learning setup for small datasets or if you are not working with high-performance computing environments.
Stars: 19
Forks: 5
Language: Python
License: —
Category:
Last pushed: Jun 26, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/NERSC/dl4sci25-dl-at-scale"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
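For scripted access, the endpoint above can be called from Python instead of curl. This is a minimal sketch, assuming only the URL pattern shown in the curl example (owner and repo appended to the quality endpoint); the JSON response schema is not documented here, so the code decodes whatever payload is returned without assuming field names.

```python
import json
import urllib.request

# Base endpoint, taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def quality_url(owner: str, repo: str) -> str:
    """Build the API URL for a given GitHub owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (schema is an assumption here)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

print(quality_url("NERSC", "dl4sci25-dl-at-scale"))
```

At the anonymous tier (100 requests/day), a batch script querying many repositories should throttle or cache results; with a free key the limit rises to 1,000 requests/day.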
Higher-rated alternatives
deepspeedai/DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference...
helmholtz-analytics/heat
Distributed tensors and Machine Learning framework with GPU and MPI acceleration in Python
hpcaitech/ColossalAI
Making large AI models cheaper, faster and more accessible
horovod/horovod
Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
bsc-wdc/dislib
The Distributed Computing library for python implemented using PyCOMPSs programming model for HPC.