sovit-123/attention_is_all_you_need
Implementation of language model papers along with several examples [NOT ALL WRITTEN FROM SCRATCH].
This project helps machine learning engineers and researchers implement transformer models for various natural language processing tasks. It takes raw text data as input and produces trained models for text classification, text generation, and other sequence-to-sequence problems. It's designed for those who want to experiment with or apply the 'Attention is All You Need' architecture.
No commits in the last 6 months.
Use this if you are a machine learning practitioner looking for an existing implementation of the transformer architecture to use or adapt for NLP tasks.
Not ideal if you are looking for a plug-and-play solution without any coding or deep understanding of neural networks.
Stars: 12
Forks: —
Language: Python
License: —
Category: —
Last pushed: Oct 02, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/sovit-123/attention_is_all_you_need"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
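For scripted access, the endpoint above can be called from Python rather than curl. This is a minimal sketch: the URL comes from the listing, but the header name used to send the API key (`X-API-Key`) is an assumption, so check the service's docs for the actual authentication scheme.

```python
import json
import urllib.request

# Base endpoint taken from the curl example in the listing.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/nlp"

def build_request(repo, api_key=None):
    """Build a GET request for a repo's quality data.

    repo: e.g. "sovit-123/attention_is_all_you_need"
    api_key: optional key for the higher 1,000/day limit.
    """
    req = urllib.request.Request(f"{API_BASE}/{repo}")
    if api_key:
        # Hypothetical header name -- not confirmed by the listing.
        req.add_header("X-API-Key", api_key)
    return req

# Fetch and decode (100 requests/day without a key):
# with urllib.request.urlopen(build_request("sovit-123/attention_is_all_you_need")) as resp:
#     data = json.load(resp)
```

Without a key you get 100 requests/day; passing a free key raises that to 1,000/day per the listing.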
Higher-rated alternatives
xv44586/toolkit4nlp
Transformer implementations (architecture, task examples, serving, and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...