jaketae/realformer

PyTorch implementation of RealFormer: Transformer Likes Residual Attention

Score: 37 / 100 (Emerging)

This project provides a specialized building block for machine learning engineers working with large language models: a PyTorch implementation of RealFormer's residual attention, in which each Transformer layer adds the previous layer's raw (pre-softmax) attention scores to its own before applying the softmax. The output is a more refined sequence representation that can be used for tasks like language understanding or text generation, aiming for better performance than standard Transformer models. This is for developers and researchers building advanced NLP systems.
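The core idea is small enough to sketch. The snippet below is a minimal, hypothetical illustration of residual attention in PyTorch, not code from this repository; names such as ResidualAttention and prev_scores are placeholders of mine. Each layer computes its raw attention scores, adds the scores passed down from the previous layer, and returns its own scores so the next layer can reuse them.

# Minimal sketch of RealFormer-style residual attention (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualAttention(nn.Module):
    """One multi-head self-attention block that carries residual attention scores."""

    def __init__(self, d_model, num_heads):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = d_model // num_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x, prev_scores=None):
        batch, seq_len, d_model = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # reshape to (batch, heads, seq_len, head_dim)
        q, k, v = (
            t.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)
            for t in (q, k, v)
        )
        # raw pre-softmax attention scores: (batch, heads, seq_len, seq_len)
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        # residual attention: add the previous layer's raw scores before softmax
        if prev_scores is not None:
            scores = scores + prev_scores
        weights = F.softmax(scores, dim=-1)
        context = (weights @ v).transpose(1, 2).reshape(batch, seq_len, d_model)
        # return the raw scores so the next layer can reuse them
        return self.out(context), scores

# usage: chain two layers, threading the scores through
layer1, layer2 = ResidualAttention(64, 4), ResidualAttention(64, 4)
x = torch.randn(2, 10, 64)
h, scores = layer1(x)
h, scores = layer2(h, prev_scores=scores)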

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher looking to improve the performance of your Transformer-based models on natural language processing tasks.

Not ideal if you are a non-developer seeking a ready-to-use application for text analysis or if you prefer a complete, pre-trained model for your NLP needs.

natural-language-processing machine-learning-engineering deep-learning-research language-model-development text-embedding
Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 16 / 25
Community 16 / 25

How are scores calculated?

Stars: 11
Forks: 7
Language: Python
License: MIT
Last pushed: May 17, 2021
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/jaketae/realformer"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
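For programmatic access from Python, here is a minimal sketch using the requests library against the same URL. It assumes the endpoint returns JSON, which is not documented above.

# Minimal sketch: fetch the same quality data from Python instead of curl.
# Assumes the endpoint returns JSON; within 100 requests/day no key is needed.
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/nlp/jaketae/realformer"
response = requests.get(url, timeout=10)
response.raise_for_status()  # surface HTTP errors (e.g. rate limiting)
print(response.json())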