fabienfrfr/tptt

😊 TPTT: Transforming Pretrained Transformers into Titans

Quality score: 29 / 100 (Experimental)

This library helps AI researchers and machine learning engineers enhance existing large language models (LLMs) by injecting more efficient attention mechanisms. It takes a pre-trained transformer model, such as one from Hugging Face, and integrates specialized 'linearized attention' modules. After a lightweight fine-tuning step, the output is a more memory-efficient and potentially faster transformer model.

Use this if you are a researcher or engineer looking to improve the efficiency and performance of existing large language models by integrating advanced memory mechanisms without extensive retraining.

Not ideal if you are looking for a plug-and-play solution for end-user applications or if you are not comfortable with fine-tuning transformer models.
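As a rough sketch of the injection workflow described above, the Python outline below loads a Hugging Face model and marks where the tptt step would go. The function name inject_linear_attention is a placeholder assumption, not tptt's documented API; check the repository README for the real entry point.

# Illustrative sketch only; `inject_linear_attention` is a hypothetical name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"  # any pretrained Hugging Face causal LM can serve as the base
base = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Assumed tptt step: replace or augment the quadratic softmax-attention layers
# with linearized-attention modules while keeping the pretrained weights.
# from tptt import inject_linear_attention  # hypothetical import
# model = inject_linear_attention(base)

# A lightweight fine-tuning pass then adapts the injected modules; the
# original weights stay largely frozen, so no extensive retraining is needed.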

large-language-models deep-learning natural-language-processing model-optimization AI-research
No package · No dependents
Maintenance: 6 / 25
Adoption: 8 / 25
Maturity: 15 / 25
Community: 0 / 25

Stars: 60
Forks:
Language: Python
License: Apache-2.0
Last pushed: Nov 24, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/fabienfrfr/tptt"

Open to everyone: 100 requests/day with no key. Get a free key for 1,000 requests/day.
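For scripting, the same endpoint can be queried from Python with only the standard library. The response schema is not documented on this page, so the sketch below simply prints the raw JSON as returned.

# Fetch the same quality data as the curl command above (unauthenticated tier).
import json
from urllib.request import urlopen

url = "https://pt-edge.onrender.com/api/v1/quality/transformers/fabienfrfr/tptt"
with urlopen(url) as resp:
    data = json.load(resp)

print(json.dumps(data, indent=2))  # inspect the fields the API actually returns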