jaketae/realformer
PyTorch implementation of RealFormer: Transformer Likes Residual Attention
This project provides a specialized building block for machine-learning engineers working with large language models. It implements RealFormer's residual attention: each layer adds the previous layer's raw (pre-softmax) attention scores to its own before the softmax, a simple skip connection over attention that the RealFormer paper reports outperforming the standard Transformer on tasks like language understanding and text generation. This is for developers and researchers building advanced NLP systems.
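The residual-attention idea can be sketched in a few lines. This is a minimal plain-Python illustration under stated assumptions, not the repo's actual PyTorch code: the function names are illustrative, shapes are tiny toy matrices, and the scores stand in for Q·Kᵀ/√d logits of one attention head.

```python
import math

def softmax(row):
    # Numerically stable softmax over a list of floats.
    m = max(row)
    exps = [math.exp(x - m) for x in row]
    s = sum(exps)
    return [e / s for e in exps]

def residual_attention(scores, prev_scores=None):
    """Sketch of RealFormer-style residual attention (hypothetical helper).

    scores: raw attention logits for one head, as rows of floats
            (one row per query position, one column per key position).
    prev_scores: same-shaped raw logits from the previous layer, or None.
    Returns (weights, raw_scores); raw_scores is what the next layer
    would receive as its residual input.
    """
    if prev_scores is not None:
        # The core RealFormer idea: add the previous layer's
        # *pre-softmax* scores to this layer's scores.
        scores = [[s + p for s, p in zip(row, prow)]
                  for row, prow in zip(scores, prev_scores)]
    weights = [softmax(row) for row in scores]
    return weights, scores

# Toy example: 2 query positions attending over 3 key positions.
layer1 = [[0.5, 0.1, -0.2], [0.0, 0.3, 0.1]]
w1, raw1 = residual_attention(layer1)

layer2 = [[0.2, 0.0, 0.1], [0.1, -0.1, 0.4]]
w2, raw2 = residual_attention(layer2, prev_scores=raw1)
```

In a real implementation the residual scores are tensors carried alongside the hidden states through every encoder layer; the arithmetic above is the same, just batched.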
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher looking to improve the performance of your Transformer-based models on natural language processing tasks.
Not ideal if you are a non-developer seeking a ready-to-use application for text analysis or if you prefer a complete, pre-trained model for your NLP needs.
Stars: 11
Forks: 7
Language: Python
License: MIT
Category:
Last pushed: May 17, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/jaketae/realformer"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
xv44586/toolkit4nlp
Transformer implementations (architectures, task examples, serving, and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...