WD-Leong/NLP-Attention-Free-Transformer

An Attention-Free Transformer, implemented in PyTorch without a self-attention mechanism.

Score: 27 / 100 (Experimental)

This project helps machine learning engineers and researchers explore and implement an Attention-Free Transformer (AFT) for natural language processing tasks. It takes movie dialog data as input and produces a trained model capable of generating conversational replies. This is ideal for those building or experimenting with sequence-to-sequence models for text generation.
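For context, AFT (Zhai et al., 2021, "An Attention Free Transformer") replaces the quadratic query-key dot products of self-attention with learned pairwise position biases and element-wise weighting. The sketch below is a minimal AFT-full layer in PyTorch, assuming the formulation from the paper rather than this repository's exact code; names and dimensions are illustrative.

import torch
import torch.nn as nn

class AFTFull(nn.Module):
    """Minimal AFT-full layer:
    Y_t = sigmoid(Q_t) * sum_s exp(K_s + w[t,s]) * V_s / sum_s exp(K_s + w[t,s])
    """

    def __init__(self, d_model: int, max_len: int):
        super().__init__()
        self.to_q = nn.Linear(d_model, d_model)
        self.to_k = nn.Linear(d_model, d_model)
        self.to_v = nn.Linear(d_model, d_model)
        # Learned pairwise position biases w[t, s], one scalar per pair.
        self.w = nn.Parameter(torch.zeros(max_len, max_len))
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, _ = x.shape                     # x: (batch, seq_len, d_model)
        q, k, v = self.to_q(x), self.to_k(x), self.to_v(x)
        # Causal mask so position t only sees s <= t (needed for generation).
        causal = torch.tril(torch.ones(T, T, dtype=torch.bool, device=x.device))
        w = self.w[:T, :T].masked_fill(~causal, float("-inf"))
        # Subtracting the max before exp is for numerical stability only;
        # the constant cancels in the numerator/denominator ratio below.
        exp_k = torch.exp(k - k.max())        # (B, T, d)
        exp_w = torch.exp(w)                  # (T, T); masked entries become 0
        num = torch.einsum("ts,bsd->btd", exp_w, exp_k * v)
        den = torch.einsum("ts,bsd->btd", exp_w, exp_k)
        return self.out(torch.sigmoid(q) * num / den)

A full model would stack such layers with feed-forward blocks and token embeddings, as in a standard Transformer decoder.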

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher interested in an alternative to traditional self-attention mechanisms for building conversational AI models.

Not ideal if you are looking for a ready-to-use chatbot for end-users, or if you need state-of-the-art performance without implementing and training models yourself.

natural-language-processing conversational-ai machine-learning-research text-generation deep-learning-models
No License · Stale (6 months) · No Package · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 8 / 25
Community 14 / 25

How are scores calculated?
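In this case, the four category scores above sum to the overall score: 0 (Maintenance) + 5 (Adoption) + 8 (Maturity) + 14 (Community) = 27 / 100.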

Stars: 9
Forks: 3
Language: Python
License: None
Last pushed: Dec 16, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/WD-Leong/NLP-Attention-Free-Transformer"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
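A minimal sketch of consuming the same endpoint from Python using only the standard library (the response is assumed to be a JSON document; its field names are not documented on this page, so the example simply pretty-prints the payload):

import json
import urllib.request

# Same endpoint as the curl command above.
URL = ("https://pt-edge.onrender.com/api/v1/quality/nlp/"
       "WD-Leong/NLP-Attention-Free-Transformer")

with urllib.request.urlopen(URL) as resp:
    data = json.load(resp)  # assumes a JSON response body

# The schema is not documented here; inspect the output to see the fields.
print(json.dumps(data, indent=2))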