kyegomez/M2PT
Implementation of M2PT in PyTorch from the paper: "Multimodal Pathway: Improve Transformers with Irrelevant Data from Other Modalities"
This project helps machine learning engineers and researchers improve the performance of existing transformer models by incorporating information from other modalities, even when that data is nominally 'irrelevant' to the task. It takes an existing transformer model plus linear layers from models trained on other modalities, and outputs a refined transformer with those auxiliary weights folded in. It is aimed at practitioners building AI models that process multiple forms of data, such as text and images.
No commits in the last 6 months. Available on PyPI.
Use this if you are developing AI models that process different types of data (like text and images) and want to boost your model's accuracy by cleverly incorporating insights from auxiliary data sources.
Not ideal if you are looking for a complete, out-of-the-box multimodal AI model, as this project provides a technique for enhancing existing models rather than a standalone solution.
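The core idea behind the paper, blending a target model's linear layers with frozen linear layers borrowed from a model trained on a different modality, can be sketched as follows. This is a minimal illustrative sketch, not the library's actual API; the class name, the learnable scale, and its zero initialization are assumptions for illustration.

```python
# Hypothetical sketch of cross-modal re-parameterization as described in
# the M2PT paper (names and interface are illustrative, not the actual
# library API): a linear layer from the target model is blended with a
# frozen linear layer from an auxiliary model of another modality,
# controlled by a small learnable scale.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossModalLinear(nn.Module):
    """Computes y = x @ (W_target + scale * W_aux)^T + b_target."""

    def __init__(self, target: nn.Linear, auxiliary: nn.Linear):
        super().__init__()
        assert target.weight.shape == auxiliary.weight.shape
        self.target = target
        self.auxiliary = auxiliary
        self.auxiliary.requires_grad_(False)       # auxiliary weights stay frozen
        self.scale = nn.Parameter(torch.zeros(1))  # learnable pathway strength

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weight = self.target.weight + self.scale * self.auxiliary.weight
        return F.linear(x, weight, self.target.bias)

# Example: augment a text model's projection with an image model's weights.
text_proj = nn.Linear(64, 64)
image_proj = nn.Linear(64, 64)
layer = CrossModalLinear(text_proj, image_proj)
out = layer(torch.randn(2, 64))  # shape: (2, 64)
```

With the scale initialized to zero, the augmented layer starts out identical to the original target layer, so training can gradually learn how much of the auxiliary pathway to mix in.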
Stars
14
Forks
1
Language
Python
License
MIT
Category
Last pushed
Mar 11, 2024
Commits (30d)
0
Dependencies
3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/kyegomez/M2PT"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
adapter-hub/adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
gaussalgo/adaptor
ACL 2022: Adaptor: a library to easily adapt a language model to your own task, domain, or...
ylsung/VL_adapter
PyTorch code for "VL-Adapter: Parameter-Efficient Transfer Learning for Vision-and-Language...
intersun/LightningDOT
source code and pre-trained/fine-tuned checkpoint for NAACL 2021 paper LightningDOT
calpt/awesome-adapter-resources
Collection of Tools and Papers related to Adapters / Parameter-Efficient Transfer Learning/ Fine-Tuning