bhavnicksm/vanilla-transformer-jax

JAX/Flax implementation of 'Attention Is All You Need' by Vaswani et al. (https://arxiv.org/abs/1706.03762)

Score: 45 / 100 (Emerging)

This is a foundational building block for artificial intelligence developers working on advanced natural language processing. It takes raw text or sequences, processes them using a specific neural network architecture, and outputs numerical representations that can be used for tasks like translation, summarization, or text generation. It's intended for AI researchers and machine learning engineers building and experimenting with cutting-edge models.
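The core operation of the Transformer architecture this package implements is scaled dot-product attention. Below is a minimal NumPy sketch of that mechanism for illustration only; it is not this package's API, and the function and variable names are hypothetical:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention from Vaswani et al. (2017).
    q, k, v: arrays of shape (seq_len, d_k)."""
    d_k = q.shape[-1]
    # Similarity scores between queries and keys, scaled by sqrt(d_k)
    scores = q @ k.T / np.sqrt(d_k)
    # Softmax over the last axis, numerically stabilised
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted sum of the value vectors
    return weights @ v

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))
k = rng.normal(size=(4, 8))
v = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (4, 8)
```

The package itself expresses this in JAX/Flax, which adds JIT compilation and automatic differentiation on top of the same array operations.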

No commits in the last 6 months. Available on PyPI.

Use this if you are an AI developer who needs a robust, ready-to-use implementation of the original Transformer architecture in JAX/Flax for research or building new NLP applications.

Not ideal if you are looking for a pre-trained model for immediate use or if you are not comfortable working with machine learning frameworks like JAX/Flax.

natural-language-processing machine-learning-engineering deep-learning-research neural-network-development text-generation
Stale: 6 months
Maintenance: 0 / 25
Adoption: 6 / 25
Maturity: 25 / 25
Community: 14 / 25


Stars: 15
Forks: 3
Language: Python
License: MIT
Last pushed: Aug 16, 2021
Commits (30d): 0
Dependencies: 3

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/bhavnicksm/vanilla-transformer-jax"

Open to everyone: 100 requests/day with no API key required. Get a free key for 1,000 requests/day.
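The same endpoint can be queried from Python. A minimal sketch using only the standard library; the response's field names depend on the API's schema, which is not documented here:

```python
import json
import urllib.request

# Endpoint from the curl example above
API_URL = ("https://pt-edge.onrender.com/api/v1/quality/"
           "transformers/bhavnicksm/vanilla-transformer-jax")

def fetch_quality(url=API_URL, timeout=10):
    """Fetch the quality scorecard and parse it as JSON.

    Returns a dict; the exact keys (scores, metadata) follow the
    API's response schema, which is assumed, not specified here.
    """
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)
```

Usage: `data = fetch_quality()` subject to the 100 requests/day anonymous limit.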