mourga/contrastive-active-learning
Code for the EMNLP 2021 Paper "Active Learning by Acquiring Contrastive Examples" & the ACL 2022 Paper "On the Importance of Effectively Adapting Pretrained Language Models for Active Learning"
This project helps machine learning practitioners in Natural Language Processing (NLP) efficiently train text classification models by intelligently selecting the most informative data to label. It takes unlabeled text data for tasks like sentiment analysis or topic classification and outputs a high-performing model with less human effort in data annotation. The primary users are ML engineers or researchers working on NLP applications who need to optimize data labeling costs.
128 stars. No commits in the last 6 months.
Use this if you need to train accurate NLP models for tasks like sentiment analysis or topic classification with limited labeled data and want to reduce the cost and time spent on manual data annotation.
Not ideal if you already have abundant labeled data for your NLP task or if your task is outside of text classification.
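The EMNLP 2021 paper's acquisition strategy (CAL) scores each unlabeled example by how much its model prediction diverges from those of its nearest labeled neighbours in feature space. Below is a minimal, illustrative sketch of that idea; the function names, plain Euclidean nearest-neighbour search, and array-based interface are assumptions for this sketch, not the repo's actual API:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Row-wise KL(p || q) between probability distributions."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return np.sum(p * np.log(p / q), axis=-1)

def contrastive_acquisition(unlabeled_emb, unlabeled_probs,
                            labeled_emb, labeled_probs,
                            k=10, budget=100):
    """Score each unlabeled example by the mean KL divergence between the
    predictive distributions of its k nearest labeled neighbours (in
    embedding space) and its own; return the top-`budget` indices."""
    scores = np.empty(len(unlabeled_emb))
    for i, (emb, probs) in enumerate(zip(unlabeled_emb, unlabeled_probs)):
        # Euclidean nearest neighbours in the labeled pool (illustrative;
        # a real implementation would use a proper ANN index).
        dists = np.linalg.norm(labeled_emb - emb, axis=1)
        nn = np.argsort(dists)[:k]
        scores[i] = kl_divergence(labeled_probs[nn], probs[None, :]).mean()
    # Highest-divergence (most "contrastive") examples are sent to annotators.
    return np.argsort(scores)[::-1][:budget]
```

Examples selected this way sit near labeled data in feature space yet receive very different predictions, which is the paper's notion of an informative, contrastive example.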
Stars: 128
Forks: 13
Language: Python
License: GPL-3.0
Category: ml-frameworks
Last pushed: May 24, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/mourga/contrastive-active-learning"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
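The same endpoint can be called from Python with the standard library; this is a minimal sketch that returns the parsed JSON without assuming anything about the response's field names:

```python
import json
import urllib.request

# Endpoint from the curl example above.
API_URL = ("https://pt-edge.onrender.com/api/v1/quality/"
           "ml-frameworks/mourga/contrastive-active-learning")

def fetch_repo_quality(url=API_URL, timeout=10):
    """Fetch the repo-quality record and return it as a parsed dict.
    No API key header is set, matching the keyless 100 requests/day tier."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))
```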
Higher-rated alternatives
AdaptiveMotorControlLab/CEBRA
Learnable latent embeddings for joint behavioral and neural analysis - Official implementation of CEBRA
theolepage/sslsv
Toolkit for training and evaluating Self-Supervised Learning (SSL) frameworks for Speaker...
PaddlePaddle/PASSL
PASSL includes image self-supervised learning algorithms such as SimCLR, MoCo v1/v2, BYOL, CLIP, PixPro, SimSiam, SwAV, BEiT, and MAE, as well as Vision...
YGZWQZD/LAMDA-SSL
30 Semi-Supervised Learning Algorithms
ModSSC/ModSSC
ModSSC: A Modular Framework for Semi-Supervised Classification