AmeyaJagtap/Adaptive_Activation_Functions

We propose simple adaptive activation functions for deep neural networks. The proposed method is easy to implement in any neural network architecture.

Score: 37 / 100 (Emerging)

This project helps researchers and engineers build more accurate and efficient deep neural networks, especially for solving complex scientific and engineering problems. You provide your problem, often involving partial differential equations or complex data, and it delivers a neural network that learns faster and provides more precise solutions than traditional methods. This is for anyone working with deep learning to model physical systems or analyze intricate datasets.

No commits in the last 6 months.

Use this if you are developing deep learning models for regression tasks or solving partial differential equations and need to improve your model's convergence speed and accuracy, particularly for forward problems.

Not ideal if you are primarily focused on classification tasks or if your current fixed-activation neural networks already meet your performance needs.
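As a rough illustration of the general idea (not the repository's own code), an adaptive activation replaces a fixed nonlinearity such as tanh(x) with tanh(n·a·x), where a is a trainable slope parameter optimized alongside the network weights and n is a fixed scale factor. The sketch below assumes this formulation from the associated paper; the function names are illustrative.

```python
import math


def adaptive_tanh(x: float, a: float, n: float = 10.0) -> float:
    """Adaptive activation tanh(n * a * x).

    `a` is a trainable slope parameter (shared globally, per layer, or
    per neuron, depending on the variant); `n` is a fixed scale factor.
    With a = 1/n this reduces to plain tanh(x).
    """
    return math.tanh(n * a * x)


def d_adaptive_tanh_da(x: float, a: float, n: float = 10.0) -> float:
    """Derivative of the activation w.r.t. the slope parameter `a`,
    used to update `a` by gradient descent along with the weights."""
    y = math.tanh(n * a * x)
    return (1.0 - y * y) * n * x
```

Because a larger n·a steepens the activation, the optimizer can sharpen or flatten the nonlinearity as training demands, which is what speeds up convergence on stiff regression targets and PDE residuals.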

Tags: scientific computing, computational physics, numerical simulation, engineering modeling, data regression
Stale (6m) | No Package | No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 16 / 25
Community 16 / 25


Stars: 13
Forks: 6
Language:
License: MIT
Last pushed: Feb 01, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/AmeyaJagtap/Adaptive_Activation_Functions"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
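The same endpoint can be queried from Python instead of curl. This is a minimal sketch using only the standard library; the endpoint path comes from the curl command above, but the shape of the JSON response is not documented here, so it is returned as a raw dict. Function names are illustrative.

```python
import json
import urllib.request

BASE_URL = "https://pt-edge.onrender.com/api/v1/quality"
REPO = "ml-frameworks/AmeyaJagtap/Adaptive_Activation_Functions"


def quality_url(repo: str, base: str = BASE_URL) -> str:
    """Build the API URL for a repository slug."""
    return f"{base}/{repo}"


def fetch_quality(repo: str) -> dict:
    """GET the quality record and decode the JSON body.

    Requires network access; raises urllib.error.URLError on failure.
    Without an API key this counts against the 100 requests/day limit.
    """
    with urllib.request.urlopen(quality_url(repo), timeout=10) as resp:
        return json.load(resp)
```

A call like `fetch_quality(REPO)` should return the same record the curl command prints.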