kyegomez/DifferentialTransformer

An open-source community implementation of the model from Microsoft's "Differential Transformer" paper.

Quality score: 33 / 100 (Emerging)

This project implements a Differential Transformer, a model designed to improve transformers that process sequential data such as text or time series. It takes raw input (e.g., token sequences) and produces cleaner, more focused representations by subtracting one attention map from another, cancelling irrelevant attention noise. It is aimed at machine learning engineers and researchers building advanced AI models.

Use this if you are developing transformer-based models and need to enhance their performance by reducing 'attention noise' in sequential data processing.

Not ideal if you are looking for an out-of-the-box application to solve a specific business problem rather than a component for model development.
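The paper's core idea is to compute two separate softmax attention maps and take their difference, so that noise common to both maps cancels out. A minimal NumPy sketch of that mechanism (all names, shapes, and the fixed lambda here are illustrative assumptions, not this repo's actual API; in the paper lambda is a learned, reparameterized scalar):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def differential_attention(x, Wq1, Wk1, Wq2, Wk2, Wv, lam=0.8):
    """Differential attention sketch: the output is the difference of two
    softmax attention maps applied to the values, (A1 - lam * A2) @ V.
    All weight matrices and `lam` are hypothetical stand-ins."""
    d = Wq1.shape[1]  # head dimension used for scaling
    q1, k1 = x @ Wq1, x @ Wk1   # first query/key projection
    q2, k2 = x @ Wq2, x @ Wk2   # second query/key projection
    v = x @ Wv
    a1 = softmax(q1 @ k1.T / np.sqrt(d))
    a2 = softmax(q2 @ k2.T / np.sqrt(d))
    # Subtracting the second map cancels attention noise shared by both.
    return (a1 - lam * a2) @ v

# Example: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
weights = [rng.standard_normal((8, 8)) for _ in range(5)]
out = differential_attention(x, *weights)
```

Note that each output row is a combination of value rows whose attention weights sum to 1 - lam rather than 1; the paper handles this with additional normalization, which this sketch omits.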

Tags: natural-language-processing, machine-learning-engineering, AI-model-development, neural-network-optimization

No package · No dependents
Maintenance: 10 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 0 / 25


Stars: 39
Forks:
Language: Python
License: MIT
Last pushed: Mar 09, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/DifferentialTransformer"

Open to everyone: 100 requests/day with no key needed, or get a free key for 1,000/day.