DeathReaper0965/distributed-deeplearning
End-to-end distributed deep learning engine built with Apache Flink; works with both streaming and batch data
This is an end-to-end pipeline for building and managing deep learning systems that process data either continuously as it arrives (streaming) or in large batches. It ingests raw data, processes it, and feeds it into deep learning models to generate predictions. Data scientists, machine learning engineers, and data engineers can use it to deploy and manage predictive models.
No commits in the last 6 months.
Use this if you need a robust system to continuously ingest data, run deep learning models for real-time predictions, and store the results for further analysis.
Not ideal if you are looking for a simple, single-machine deep learning model training solution without distributed processing or real-time streaming requirements.
Stars
10
Forks
1
Language
Java
License
Apache-2.0
Category
ML Frameworks
Last pushed
Aug 30, 2020
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/DeathReaper0965/distributed-deeplearning"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
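The same endpoint can also be called programmatically. Below is a minimal sketch in Java (the repository's listed language) using the standard `java.net.http.HttpClient` available since Java 11; the class and method names are illustrative, and the response is assumed to be JSON as the API returns it.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class QualityApiClient {
    // Endpoint copied from the curl example above.
    static final String ENDPOINT =
        "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/DeathReaper0965/distributed-deeplearning";

    // Build a plain GET request for the quality endpoint (no API key:
    // the free tier of 100 requests/day needs none).
    static HttpRequest buildRequest() {
        return HttpRequest.newBuilder()
                .uri(URI.create(ENDPOINT))
                .GET()
                .build();
    }

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // Send the request and print the raw response body.
        HttpResponse<String> response =
            client.send(buildRequest(), HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```

Separating `buildRequest()` from the network call keeps the request construction testable and makes it easy to add headers (for example, an API key for the higher rate limit) in one place.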
Higher-rated alternatives
lensacom/sparkit-learn
PySpark + Scikit-learn = Sparkit-learn
Angel-ML/angel
A Flexible and Powerful Parameter Server for large-scale machine learning
flink-extended/dl-on-flink
Deep Learning on Flink aims to integrate Flink and deep learning frameworks (e.g. TensorFlow,...
MingChen0919/learning-apache-spark
Notes on Apache Spark (pyspark)
mahmoudparsian/data-algorithms-book
MapReduce, Spark, Java, and Scala for Data Algorithms Book