alon-albalak/FLAD
Few-shot Learning with Auxiliary Data
This project helps machine learning practitioners improve natural language processing models when only a small amount of labeled data is available for a target task. By leveraging additional, related datasets (auxiliary data), it trains more generalizable models: given text-based auxiliary and target datasets, it outputs a fine-tuned language model that performs better on the target task. It is aimed at AI/ML researchers and practitioners working on language models who need strong performance from limited task-specific data.
No commits in the last 6 months.
Use this if you are working on few-shot natural language processing tasks and have access to auxiliary datasets that could help improve your model's generalization.
Not ideal if you are not working with language models, or if you do not have access to any auxiliary data to supplement your few-shot learning.
Stars: 31
Forks: 3
Language: Python
License: —
Category:
Last pushed: Dec 08, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/alon-albalak/FLAD"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
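If you would rather call the endpoint from Python, the curl command above translates to a small sketch like the following. Note the `X-API-Key` header name is an assumption for illustration; check the service's documentation for how it actually expects a key to be passed.

```python
from typing import Optional
import urllib.request

# Endpoint taken from the curl command above; the per-repo path is "owner/name".
BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def build_request(owner: str, repo: str,
                  api_key: Optional[str] = None) -> urllib.request.Request:
    """Build a GET request for a repository's quality record.

    The "X-API-Key" header name is an assumption, not confirmed by the
    listing; anonymous access is limited to 100 requests/day.
    """
    req = urllib.request.Request(f"{BASE}/{owner}/{repo}")
    if api_key:  # a free key raises the limit to 1,000 requests/day
        req.add_header("X-API-Key", api_key)
    return req

req = build_request("alon-albalak", "FLAD")
# data = json.load(urllib.request.urlopen(req))  # uncomment (and import json) to fetch live
```

The fetch itself is left commented out so the sketch builds the request without touching the network; the response format is not documented in this listing, so parse the JSON defensively.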
Higher-rated alternatives
jakesnell/prototypical-networks
Code for the NeurIPS 2017 Paper "Prototypical Networks for Few-shot Learning"
harveyslash/Facial-Similarity-with-Siamese-Networks-in-Pytorch
Implementing Siamese networks with a contrastive loss for similarity learning
oscarknagg/few-shot
Repository for few-shot learning machine learning projects
google-research/meta-dataset
A dataset of datasets for learning to learn from few examples
Sha-Lab/FEAT
The code repository for "Few-Shot Learning via Embedding Adaptation with Set-to-Set Functions"