IvanBongiorni/maximal

A Python library of models and layers for implementing custom Transformer neural networks, built on TensorFlow 2.

38 / 100 · Emerging

This library helps machine learning engineers and researchers build custom Transformer neural networks. It provides foundational components like attention mechanisms, positional embeddings, and complete Transformer encoder and GPT blocks. Users can integrate these modular layers into their TensorFlow 2 Keras models to create specialized large language or vision models.
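The foundational components mentioned above, attention and positional embeddings, can be sketched in plain NumPy. This is a generic illustration of the standard mechanisms, not maximal's API; the function names here are invented for the example.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Standard sinusoidal positional encoding: sin on even dims, cos on odd."""
    positions = np.arange(seq_len)[:, None]            # (seq_len, 1)
    dims = np.arange(d_model)[None, :]                 # (1, d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates                   # (seq_len, d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])
    pe[:, 1::2] = np.cos(angles[:, 1::2])
    return pe

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)
    # numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

# Toy usage: 4 tokens, model width 8
x = np.random.default_rng(0).normal(size=(4, 8))
x = x + sinusoidal_positional_encoding(4, 8)           # inject position info
out, attn = scaled_dot_product_attention(x, x, x)      # self-attention
print(out.shape, attn.shape)                           # (4, 8) (4, 4)
```

In a TensorFlow 2 workflow, layers implementing these mechanisms are stacked inside a Keras model; the library's value proposition is shipping them as ready-made, composable Keras layers.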

No commits in the last 6 months. Available on PyPI.

Use this if you are a machine learning engineer or researcher looking to design and implement your own Transformer-based models within the TensorFlow 2 ecosystem, leveraging pre-built, flexible components.

Not ideal if you need a complete, pre-trained Transformer model for immediate use without custom architectural design, or if you are working outside of TensorFlow 2.

deep-learning neural-network-architecture natural-language-processing-research computer-vision-research model-development
Stale (6m) · No dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 25 / 25
Community 8 / 25


Stars: 9
Forks: 1
Language: Python
License: MIT
Last pushed: Oct 29, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/IvanBongiorni/maximal"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.