Rishit-dagli/GLU

An easy-to-use library for GLU (Gated Linear Units) and GLU variants in TensorFlow.

Overall score: 37 / 100 (Emerging)

This library provides Gated Linear Unit (GLU) activation functions and variants such as SwiGLU and GEGLU for use in TensorFlow models. GLU-style layers gate one linear projection of the input with an activation of a second projection, which can improve performance on tasks like language understanding. It is aimed at machine learning engineers and researchers building and training deep learning models.
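To make the gating idea concrete, here is a minimal NumPy sketch of the GLU and SwiGLU computations. This is an illustration of the math only, not this library's API; the function names `glu` and `swiglu` are chosen for this example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def glu(x, axis=-1):
    """Gated Linear Unit: split x in half along `axis` and
    gate one half with the sigmoid of the other."""
    a, b = np.split(x, 2, axis=axis)
    return a * sigmoid(b)

def swiglu(x, axis=-1):
    """SwiGLU variant: gate with Swish (b * sigmoid(b)) instead of sigmoid(b)."""
    a, b = np.split(x, 2, axis=axis)
    return a * (b * sigmoid(b))

x = np.array([[1.0, -2.0, 0.5, 3.0]])  # batch of 1, feature dim 4
print(glu(x).shape)  # (1, 2) -- the gate halves the feature dimension
```

Note that GLU-style layers halve the last dimension, so in a real model the preceding projection is typically sized at twice the desired output width.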

No commits in the last 6 months.

Use this if you are developing deep learning models in TensorFlow and want to experiment with advanced GLU-based activation functions to potentially improve model accuracy and training efficiency.

Not ideal if you are not working with TensorFlow or are looking for a high-level, no-code solution for general machine learning tasks.

deep-learning natural-language-processing model-optimization neural-network-design machine-learning-engineering
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 6 / 25
Maturity 16 / 25
Community 15 / 25

How are scores calculated?

Stars: 20

Forks: 4

Language: Python

License: Apache-2.0

Last pushed: Feb 22, 2023

Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Rishit-dagli/GLU"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.