maxpumperla/elephas

Distributed Deep Learning with Keras & Spark

Score: 51 / 100 (Established)

This project helps data scientists and machine learning engineers train deep learning models faster by distributing the workload across a cluster of machines. You provide a Keras model and a large dataset, and Elephas leverages Apache Spark to process the data in parallel, producing a trained model ready for predictions. It is best suited to datasets too large to train on efficiently with a single machine.
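As a rough illustration, here is a minimal training sketch following the usage pattern from the project's README. The random data, layer sizes, and Spark master setting are placeholders, and exact signatures may vary between elephas versions:

import numpy as np
from pyspark import SparkConf, SparkContext
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from elephas.spark_model import SparkModel
from elephas.utils.rdd_utils import to_simple_rdd

# Local Spark context for illustration; point setMaster at a real cluster instead.
conf = SparkConf().setAppName('elephas_example').setMaster('local[4]')
sc = SparkContext(conf=conf)

# Placeholder data standing in for a large training set.
x_train = np.random.rand(1000, 784)
y_train = np.random.randint(0, 2, size=(1000, 1))

# An ordinary Keras model, defined and compiled as usual.
model = Sequential([
    Dense(64, activation='relu', input_dim=784),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='sgd', loss='binary_crossentropy')

# Distribute the data as a Spark RDD, then train across the workers.
rdd = to_simple_rdd(sc, x_train, y_train)
spark_model = SparkModel(model, frequency='epoch', mode='asynchronous')
spark_model.fit(rdd, epochs=5, batch_size=32, verbose=0, validation_split=0.1)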

1,578 stars. No commits in the last 6 months.

Use this if you are a data scientist or machine learning engineer struggling with long training times for Keras deep learning models on very large datasets.

Not ideal if your datasets are small enough to be handled efficiently on a single machine without distributed computing.

deep-learning distributed-training big-data-analytics machine-learning-engineering model-training
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 25 / 25


Stars: 1,578
Forks: 309
Language: Python
License: MIT
Last pushed: May 01, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/maxpumperla/elephas"

Open to everyone: 100 requests per day, no key needed. Get a free key for 1,000 requests per day.
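For programmatic access, here is a short Python sketch using only the standard library. The endpoint URL comes from the curl command above; the response schema is not documented on this page, so the example simply prints the raw payload rather than assuming field names:

import json
import urllib.request

# Endpoint taken from the curl example above.
URL = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/maxpumperla/elephas"

with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)

# The JSON schema is not shown on this page, so print the raw payload
# instead of picking out specific fields.
print(json.dumps(data, indent=2))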