jaketae/g-mlp
PyTorch implementation of "Pay Attention to MLPs" (gMLP)
This project offers an attention-free alternative to traditional transformer models for developers working on large-scale language processing or image recognition tasks. It implements gMLP, which replaces self-attention with gated MLP blocks, taking text sequences or image data as input and producing outputs for tasks such as language modeling or image classification. Developers and machine learning engineers can use it to build and experiment with powerful deep learning models that need no attention layers.
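At its core, gMLP (from the paper above) replaces self-attention with a Spatial Gating Unit that mixes information across token positions through a single learned linear projection. The following is a minimal sketch of one gMLP block in PyTorch; layer names, dimensions, and initialization details are illustrative and are not this repository's exact API:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialGatingUnit(nn.Module):
    """Splits channels in two; gates one half with a learned
    linear projection applied across the sequence dimension."""
    def __init__(self, d_ffn: int, seq_len: int):
        super().__init__()
        self.norm = nn.LayerNorm(d_ffn // 2)
        # Mixes information across token positions (replaces attention)
        self.spatial_proj = nn.Linear(seq_len, seq_len)
        # Initialize the gate close to identity, as the paper suggests
        nn.init.zeros_(self.spatial_proj.weight)
        nn.init.ones_(self.spatial_proj.bias)

    def forward(self, x):
        u, v = x.chunk(2, dim=-1)          # (B, L, d_ffn/2) each
        v = self.norm(v)
        # Project along the sequence axis, then restore (B, L, C) layout
        v = self.spatial_proj(v.transpose(1, 2)).transpose(1, 2)
        return u * v

class gMLPBlock(nn.Module):
    """One gMLP block: norm -> channel projection -> spatial gating
    -> channel projection, with a residual connection."""
    def __init__(self, d_model: int, d_ffn: int, seq_len: int):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.proj_in = nn.Linear(d_model, d_ffn)
        self.sgu = SpatialGatingUnit(d_ffn, seq_len)
        self.proj_out = nn.Linear(d_ffn // 2, d_model)

    def forward(self, x):
        residual = x
        x = F.gelu(self.proj_in(self.norm(x)))
        x = self.sgu(x)
        return residual + self.proj_out(x)
```

A full model stacks several such blocks between an embedding layer and a task head; the block preserves the input shape, e.g. `gMLPBlock(128, 256, 64)` maps a `(batch, 64, 128)` tensor to the same shape.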
No commits in the last 6 months.
Use this if you are a machine learning engineer looking to implement or experiment with state-of-the-art, attention-free neural network architectures for language or vision tasks.
Not ideal if you are an end-user without deep learning development experience looking for an out-of-the-box application.
Stars
40
Forks
7
Language
Python
License
MIT
Category
NLP
Last pushed
Jun 28, 2021
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/jaketae/g-mlp"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.
Higher-rated alternatives
xv44586/toolkit4nlp
Transformers implementation (architecture, task examples, serving, and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...