WD-Leong/NLP-Attention-Free-Transformer
An Attention-Free Transformer (AFT), which replaces the self-attention mechanism, implemented in PyTorch.
This project helps machine learning engineers and researchers explore and implement an Attention-Free Transformer (AFT) for natural language processing tasks. It trains on movie-dialog data and produces a model that generates conversational replies. It is aimed at those building or experimenting with sequence-to-sequence models for text generation.
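The core AFT operation (as described in the original AFT paper by Zhai et al.; this repository's exact implementation may differ) replaces dot-product self-attention with a per-position weighted average of values, gated elementwise by a sigmoid of the query. A minimal NumPy sketch, with hypothetical names and shapes chosen for illustration:

```python
import numpy as np

def aft_full(Q, K, V, w):
    """Sketch of the AFT-full layer (Zhai et al., 2021).

    Q, K, V: arrays of shape (T, d); w: learned pairwise position
    biases of shape (T, T). Instead of dot-product attention, each
    output position t takes a softmax-like weighted average of all
    values, then gates it with sigmoid(Q_t).
    """
    # exp(K_{t'} + w_{t,t'}) for every (t, t') pair -> shape (T, T, d)
    weights = np.exp(K[None, :, :] + w[:, :, None])
    num = (weights * V[None, :, :]).sum(axis=1)   # weighted values, (T, d)
    den = weights.sum(axis=1)                     # normalizer, (T, d)
    gate = 1.0 / (1.0 + np.exp(-Q))               # sigmoid(Q)
    return gate * (num / den)

rng = np.random.default_rng(0)
T, d = 4, 8
Y = aft_full(rng.normal(size=(T, d)),
             rng.normal(size=(T, d)),
             rng.normal(size=(T, d)),
             rng.normal(size=(T, T)))
print(Y.shape)  # (4, 8)
```

Note that this formulation has no T x T attention-score matrix per head in the usual sense; the position biases `w` carry the positional interaction, which is what makes the layer "attention-free".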
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher interested in an alternative to traditional self-attention mechanisms for building conversational AI models.
Not ideal if you are looking for a ready-to-use chatbot for end-users, or if you need state-of-the-art performance without implementing and training models yourself.
Stars
9
Forks
3
Language
Python
License
—
Category
Last pushed
Dec 16, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/WD-Leong/NLP-Attention-Free-Transformer"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
xv44586/toolkit4nlp
Transformers implementations (architectures, task examples, serving, and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...