kyegomez/MGQA
The open-source implementation of multi-grouped query attention (MGQA), from the paper "GQA: Training Generalized Multi-Query Transformer Models from Multi-Head Checkpoints"
This project offers a specialized attention module designed to improve how large language models process information. It takes text or other sequential data as input and produces more efficient, focused representations. It is aimed at machine learning engineers and researchers who build or fine-tune transformer-based models and want to optimize their performance for tasks such as natural language processing and sequence modeling.
Used by 1 other package. No commits in the last 6 months. Available on PyPI.
Use this if you are a machine learning engineer or researcher looking to improve the memory and computational efficiency of your transformer models via a grouped-query attention mechanism (a sketch of the idea follows below).
Not ideal if you are an end user of AI models, a data scientist focused on traditional machine learning, or someone who does not work directly on transformer architecture design.
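
For orientation, here is a minimal sketch of the grouped-query attention idea in PyTorch. The class name, constructor parameters, and shapes are illustrative assumptions, not this repo's actual API: query heads are split into groups, and each group shares a single key/value head, shrinking the K/V projections and cache relative to standard multi-head attention.

# Minimal grouped-query attention sketch (assumed names, not the MGQA package API).
import torch
import torch.nn.functional as F
from torch import nn

class GroupedQueryAttention(nn.Module):
    """Query heads are divided into groups; each group shares one K/V head."""

    def __init__(self, dim: int, num_heads: int, num_kv_heads: int):
        super().__init__()
        assert dim % num_heads == 0, "dim must divide evenly across query heads"
        assert num_heads % num_kv_heads == 0, "query heads must divide evenly into KV groups"
        self.num_heads = num_heads
        self.num_kv_heads = num_kv_heads
        self.head_dim = dim // num_heads
        self.q_proj = nn.Linear(dim, num_heads * self.head_dim, bias=False)
        self.k_proj = nn.Linear(dim, num_kv_heads * self.head_dim, bias=False)
        self.v_proj = nn.Linear(dim, num_kv_heads * self.head_dim, bias=False)
        self.o_proj = nn.Linear(num_heads * self.head_dim, dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, _ = x.shape
        # Project and reshape to (batch, heads, seq, head_dim).
        q = self.q_proj(x).view(b, t, self.num_heads, self.head_dim).transpose(1, 2)
        k = self.k_proj(x).view(b, t, self.num_kv_heads, self.head_dim).transpose(1, 2)
        v = self.v_proj(x).view(b, t, self.num_kv_heads, self.head_dim).transpose(1, 2)
        # Repeat each K/V head so every query head in its group attends to it.
        group = self.num_heads // self.num_kv_heads
        k = k.repeat_interleave(group, dim=1)
        v = v.repeat_interleave(group, dim=1)
        out = F.scaled_dot_product_attention(q, k, v)  # (b, heads, t, head_dim)
        out = out.transpose(1, 2).reshape(b, t, self.num_heads * self.head_dim)
        return self.o_proj(out)

x = torch.randn(2, 16, 512)
attn = GroupedQueryAttention(dim=512, num_heads=8, num_kv_heads=2)
print(attn(x).shape)  # torch.Size([2, 16, 512])

With num_kv_heads equal to num_heads this reduces to standard multi-head attention; with num_kv_heads = 1 it becomes multi-query attention. Intermediate values give the grouped variants that the GQA paper interpolates between.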
Stars: 15
Forks: 1
Language: Python
License: MIT
Category: transformers
Last pushed: Dec 11, 2023
Commits (30d): 0
Dependencies: 3
Reverse dependents: 1
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/MGQA"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
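
If you prefer Python to curl, a minimal sketch of the same request follows. The endpoint is the one shown above; the shape of the returned JSON is not documented here, so the example just prints the raw payload.

# Fetch the quality data for this repo from the API shown above.
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/MGQA"
resp = requests.get(url, timeout=10)
resp.raise_for_status()  # surface HTTP errors, e.g. hitting the daily rate limit
print(resp.json())       # raw quality metrics for kyegomez/MGQA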
Higher-rated alternatives
kyegomez/RT-X
Pytorch implementation of the models RT-1-X and RT-2-X from the paper: "Open X-Embodiment:...
kyegomez/PALI3
Implementation of PALI3 from the paper "PALI-3 VISION LANGUAGE MODELS: SMALLER, FASTER, STRONGER"
chuanyangjin/MMToM-QA
[🏆Outstanding Paper Award at ACL 2024] MMToM-QA: Multimodal Theory of Mind Question Answering
lyuchenyang/Macaw-LLM
Macaw-LLM: Multi-Modal Language Modeling with Image, Video, Audio, and Text Integration
Muennighoff/vilio
🥶Vilio: State-of-the-art VL models in PyTorch & PaddlePaddle