AmirhosseinHonardoust/Noise-Injection-Techniques

Noise Injection Techniques is a comprehensive exploration of methods for making machine learning models more robust to noisy, real-world data. The repository explains and demonstrates Gaussian noise, dropout, mixup, masking, adversarial noise, and label smoothing, pairing intuitive explanations and theory with practical code examples.
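The repository's own code is not shown on this page, but the simplest of the listed techniques, Gaussian noise injection, can be sketched in a few lines of Python. Everything here (the function name, the `std` default) is illustrative, not taken from the repository:

```python
import numpy as np

def add_gaussian_noise(X, std=0.1, rng=None):
    """Return a copy of X with zero-mean Gaussian noise added.

    Perturbing inputs at train time discourages the model from
    memorizing exact feature values, improving robustness to
    sensor glitches and similar real-world corruption.
    """
    rng = np.random.default_rng(rng)
    return X + rng.normal(loc=0.0, scale=std, size=X.shape)

# Example: noise a batch of 4 samples with 3 features each.
X = np.zeros((4, 3))
X_noisy = add_gaussian_noise(X, std=0.1, rng=42)
print(X_noisy.shape)  # (4, 3)
```

In practice the noise is applied per batch inside the training loop, so each epoch sees a slightly different version of every sample.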

Overall score: 25 / 100 (Experimental)

This guide helps data scientists, ML engineers, and researchers make their machine learning models more reliable and less sensitive to messy or 'bad' data. It provides methods to intentionally introduce controlled noise into training data or models, transforming fragile models into resilient systems. You'll learn techniques to improve how models perform when faced with real-world imperfections like sensor glitches, typos, or missing values.
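One concrete way to "intentionally introduce controlled noise," and one of the techniques the repository covers, is mixup, which blends random pairs of training examples and their labels. The sketch below is a minimal illustration under the usual mixup formulation (lambda drawn from a Beta distribution; labels assumed one-hot), not code from the repository:

```python
import numpy as np

def mixup(X, y, alpha=0.2, rng=None):
    """Mixup augmentation: convex-combine random pairs of examples.

    Draws lam ~ Beta(alpha, alpha), then returns
        X' = lam * X + (1 - lam) * X[perm]
        y' = lam * y + (1 - lam) * y[perm]
    y must be one-hot encoded so labels can be blended the same way.
    """
    rng = np.random.default_rng(rng)
    lam = rng.beta(alpha, alpha)
    idx = rng.permutation(len(X))
    X_mix = lam * X + (1 - lam) * X[idx]
    y_mix = lam * y + (1 - lam) * y[idx]
    return X_mix, y_mix

# Example: 4 one-hot-labeled samples; mixed labels still sum to 1.
X = np.eye(4)
y = np.eye(4)
X_mix, y_mix = mixup(X, y, alpha=0.2, rng=0)
print(y_mix.sum(axis=1))  # [1. 1. 1. 1.]
```

Because the blended labels remain valid probability distributions, mixup trains the model to behave linearly between examples, which tends to smooth decision boundaries.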

Use this if you are a data scientist, ML engineer, or researcher struggling with models that perform well in controlled environments but fail when deployed with real, imperfect data.

Not ideal if your dataset is extremely clean and stable, or if you are working with very simple linear models where noise injection offers limited benefits.

machine-learning-robustness model-generalization data-quality-improvement deep-learning-training predictive-modeling
No package. No dependents.
Maintenance 6 / 25
Adoption 6 / 25
Maturity 13 / 25
Community 0 / 25


Stars: 22
Forks:
Language:
License: MIT
Last pushed: Nov 15, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AmirhosseinHonardoust/Noise-Injection-Techniques"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.