MrGoriay/pwlu-pytorch

Unofficial PyTorch implementation of the Piecewise Linear Unit (PWLU) dynamic activation function

Score: 14 / 100 (Experimental)

This project provides a specialized activation function for neural networks, the Piecewise Linear Unit (PWLU), aimed at AI researchers and deep learning engineers who want more flexible and efficient models. Applied to layer outputs during training and inference, it learns a piecewise linear shape that can better approximate complex functions, improving model performance.
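To illustrate the idea (not this repository's actual API), here is a minimal NumPy sketch of a piecewise linear unit: the function is defined by learnable heights at evenly spaced breakpoints over an interval, with linear interpolation inside and linear extrapolation outside. The parameter names (`left`, `right`, `heights`, `slope_l`, `slope_r`) are illustrative assumptions; the real implementation stores these as trainable PyTorch parameters.

```python
import numpy as np

def pwlu(x, left=-2.0, right=2.0, heights=None, slope_l=0.0, slope_r=1.0):
    """Sketch of a Piecewise Linear Unit (hypothetical, NumPy-only).

    heights: function values at evenly spaced breakpoints on [left, right];
    here initialized to a ReLU-like shape when not provided.
    """
    pts = np.linspace(left, right, 9 if heights is None else len(heights))
    if heights is None:
        heights = np.maximum(pts, 0.0)  # ReLU-like initialization
    y = np.interp(x, pts, heights)  # linear interpolation inside [left, right]
    # linear extrapolation with fixed slopes outside the interval
    y = np.where(x < left, heights[0] + slope_l * (x - left), y)
    y = np.where(x > right, heights[-1] + slope_r * (x - right), y)
    return y
```

With the ReLU-like initialization above, `pwlu` reproduces ReLU on the breakpoints and extrapolates linearly beyond them; training would adjust `heights` (and boundary slopes) per channel.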

No commits in the last 6 months.

Use this if you are developing or experimenting with deep neural networks and want to explore a dynamic activation function that offers computational efficiency and strong approximation capabilities.

Not ideal if you are looking for a plug-and-play solution with extensive testing and validation across diverse real-world datasets; this implementation is still experimental.

deep-learning-research neural-network-design machine-learning-engineering model-optimization artificial-intelligence-development
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 6 / 25
Maturity 8 / 25
Community 0 / 25


Stars: 18
Forks:
Language: Python
License: none
Last pushed: Feb 08, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/MrGoriay/pwlu-pytorch"

Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.