lancopku/meSimp
Codes for "Training Simplification and Model Simplification for Deep Learning: A Minimal Effort Back Propagation Method"
This project helps machine learning engineers and researchers reduce the size and computational cost of their neural networks. By applying a "minimal effort" backpropagation method during training, it turns a standard neural network into a significantly smaller version with equal or better accuracy, enabling faster training and decoding in real-world applications.
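The core idea behind the paper is easy to sketch: during backpropagation, only the top-k gradient components (by magnitude) are kept and propagated, so most of the backward computation is skipped. Below is a rough NumPy sketch of that top-k step for a single linear layer; the function name, shapes, and choice of k are illustrative only, and the repo's actual implementation is in C#:

    import numpy as np

    def top_k_grad(grad_output, k):
        # Zero all but the k largest-magnitude entries in each row,
        # so only those components are propagated backward.
        sparse = np.zeros_like(grad_output)
        idx = np.argsort(np.abs(grad_output), axis=1)[:, -k:]
        rows = np.arange(grad_output.shape[0])[:, None]
        sparse[rows, idx] = grad_output[rows, idx]
        return sparse

    rng = np.random.default_rng(0)
    x = rng.standard_normal((4, 8))        # batch of 4 examples, input dim 8
    W = rng.standard_normal((8, 16))       # weight matrix, output dim 16
    grad_y = rng.standard_normal((4, 16))  # upstream gradient dL/dy
    grad_y_sparse = top_k_grad(grad_y, k=3)
    grad_W = x.T @ grad_y_sparse           # only a few columns of W get updates
    grad_x = grad_y_sparse @ W.T           # input gradient is cheap for the same reason

Dimensions whose gradients rarely make the top-k receive almost no updates; pruning them afterwards is, roughly, the model-simplification half of the method and is where the size reduction comes from.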
No commits in the last 6 months.
Use this if you are training neural networks (such as MLPs or LSTMs) and need to reduce their size and computational footprint without sacrificing accuracy; in some cases accuracy even improves.
Not ideal if you are not working with neural networks, or if you need a solution that leaves the training process itself unchanged.
Stars: 18
Forks: 3
Language: C#
License: —
Category: NLP
Last pushed: May 13, 2018
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/lancopku/meSimp"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
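For programmatic access, the same endpoint can be called from any HTTP client. Here is a minimal Python sketch using only the standard library; the response schema is not documented on this page, so it simply pretty-prints whatever JSON the API returns:

    import json
    import urllib.request

    # Same endpoint as the curl example above.
    url = "https://pt-edge.onrender.com/api/v1/quality/nlp/lancopku/meSimp"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)  # parse the JSON body
    print(json.dumps(data, indent=2))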
Higher-rated alternatives
google/langfun
OO for LLMs
tanaos/artifex
Small Language Model Inference, Fine-Tuning and Observability. No GPU, no labeled data needed.
preligens-lab/textnoisr
Add random noise to a text dataset while precisely controlling the quality of the result
vulnerability-lookup/VulnTrain
A tool to generate datasets and models based on vulnerability descriptions from @Vulnerability-Lookup.
masakhane-io/masakhane-mt
Machine Translation for Africa