digantamisra98/Mish
Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
Mish is a library for machine learning practitioners and researchers who build deep learning models. It provides a novel activation function, a core component of neural networks that introduces the non-linearity needed for learning. Swapping Mish in for a standard activation such as ReLU can improve training stability and final accuracy on tasks like image recognition and natural language processing.
1,303 stars. Actively maintained with 8 commits in the last 30 days.
Use this if you are developing or training deep neural networks and want to explore advanced activation functions to improve model accuracy and training stability.
Not ideal if you are looking for a complete end-to-end machine learning solution or are not working directly with neural network architecture design.
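As defined in the paper, Mish is f(x) = x · tanh(softplus(x)), where softplus(x) = ln(1 + eˣ). The official repository ships framework-specific implementations (e.g. for PyTorch and TensorFlow); the following is only a minimal pure-Python sketch of the formula for illustration:

```python
import math

def softplus(x: float) -> float:
    # Numerically stable softplus: ln(1 + e^x).
    # The naive form overflows for large x, so rewrite as
    # max(x, 0) + ln(1 + e^-|x|).
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

def mish(x: float) -> float:
    # Mish activation: f(x) = x * tanh(softplus(x)).
    # Smooth, self-regularized, and non-monotonic: it dips slightly
    # below zero for negative inputs before approaching zero.
    return x * math.tanh(softplus(x))
```

For large positive inputs Mish behaves like the identity (mish(10) ≈ 10), while for negative inputs it is bounded below rather than clipped to zero, which is the non-monotonic behavior the paper's title refers to.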
Stars
1,303
Forks
128
Language
Jupyter Notebook
License
MIT
Category
Last pushed
Mar 09, 2026
Commits (30d)
8
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/digantamisra98/Mish"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
Related frameworks
Sentdex/nnfs_book
Sample code from the Neural Networks from Scratch book.
itdxer/neupy
NeuPy is a TensorFlow-based Python library for prototyping and building neural networks.
vzhou842/cnn-from-scratch
A Convolutional Neural Network implemented from scratch (using only numpy) in Python.
nicklashansen/rnn_lstm_from_scratch
How to build RNNs and LSTMs from scratch with NumPy.
Synthaze/EpyNN
Educational Python library for neural networks.