gazelle93/Transformer-Various-Positional-Encoding
This project implements Transformer encoder blocks using various positional encoding methods.
It helps machine learning engineers and researchers experiment with Transformer encoder blocks for natural language processing tasks. It takes raw text as input, processes it through a chosen positional encoding method (absolute, Shaw et al., or Raffel et al.), and outputs tokenized results, attention scores, and hidden states, which can then be used to build and experiment with more advanced language models.
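The "absolute" variant named above is usually the fixed sinusoidal encoding from the original Transformer paper. A minimal sketch of that idea follows; the function name and dimensions are illustrative and not taken from this repo:

```python
import math

def absolute_positional_encoding(seq_len, d_model):
    """Sinusoidal absolute positional encoding (Vaswani et al. style):
    even dimensions get sin(pos / 10000^(i/d_model)), odd get the cosine."""
    pe = [[0.0] * d_model for _ in range(seq_len)]
    for pos in range(seq_len):
        for i in range(0, d_model, 2):
            angle = pos / (10000 ** (i / d_model))
            pe[pos][i] = math.sin(angle)
            if i + 1 < d_model:
                pe[pos][i + 1] = math.cos(angle)
    return pe

pe = absolute_positional_encoding(4, 8)
# pe[0] alternates sin(0)=0 and cos(0)=1: [0.0, 1.0, 0.0, 1.0, ...]
```

Relative schemes such as Shaw et al. and Raffel et al. instead inject learned position information into the attention scores rather than adding a fixed vector to the input embeddings.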
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher developing advanced natural language processing models and need to experiment with different positional encoding techniques within Transformer architectures.
Not ideal if you are looking for an out-of-the-box, production-ready NLP application or a high-level library for general text analysis.
Stars
24
Forks
2
Language
Python
License
—
Category
Last pushed
Jun 11, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/gazelle93/Transformer-Various-Positional-Encoding"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
xv44586/toolkit4nlp
Transformer implementations (architecture, task examples, serving, and more)
luozhouyang/transformers-keras
Transformer-based models implemented in TensorFlow 2.x (using Keras).
ufal/neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
graykode/xlnet-Pytorch
Simple XLNet implementation with a PyTorch wrapper
uzaymacar/attention-mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language...