PrithivirajDamodaran/Alt-ZSC
Alternate implementation of Zero Shot Text Classification: instead of reframing NLI/XNLI, this reframes the text backbone of CLIP models to do ZSC. Hence, it can be lightweight and supports more languages without trading off accuracy. (Super simple, a 10th-grader could totally write this but since no 10th-grader did, I did) - Prithivi Da
This tool helps you quickly categorize text snippets into predefined categories, even if you haven't explicitly trained a model on those categories. You input a piece of text and a list of potential labels, and it outputs the text along with the most likely labels and their scores. This is ideal for anyone who needs to sort or understand unstructured text data, such as market researchers, content moderators, or customer service analysts.
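The label-scoring step described above can be sketched with a minimal embedding-similarity routine. This is an illustrative assumption of how such a tool ranks labels, not the repo's actual code: Alt-ZSC embeds the text and labels with a CLIP text encoder, whereas `zero_shot_classify` and the toy vectors below are hypothetical stand-ins.

```python
import numpy as np

def zero_shot_classify(text_emb, label_embs, labels):
    """Rank candidate labels by cosine similarity to the text embedding.

    text_emb:   1-D embedding of the input text
    label_embs: 2-D array, one row per candidate label
    labels:     label strings, same order as label_embs rows
    Returns (label, score) pairs sorted by descending score.
    """
    # L2-normalize so the dot product equals cosine similarity
    text_emb = text_emb / np.linalg.norm(text_emb)
    label_embs = label_embs / np.linalg.norm(label_embs, axis=1, keepdims=True)
    sims = label_embs @ text_emb
    # Softmax turns similarities into scores that sum to 1
    probs = np.exp(sims) / np.exp(sims).sum()
    order = np.argsort(-probs)
    return [(labels[i], float(probs[i])) for i in order]
```

With real CLIP embeddings, the text and each candidate label would be encoded separately and fed in place of the toy vectors; the ranking logic is unchanged.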
No commits in the last 6 months.
Use this if you need to classify text into categories without extensive training data, especially across multiple languages, and value a lightweight, efficient solution.
Not ideal if your text classification requires highly specialized, nuanced categories that are not well-represented by common language patterns.
Stars: 37
Forks: 5
Language: Python
License: MIT
Category:
Last pushed: Apr 05, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/PrithivirajDamodaran/Alt-ZSC"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
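The same request can be issued from code. A minimal sketch, assuming only the URL pattern visible in the curl command above; the response schema and any key-passing mechanism are not documented here, so this helper stops at building the URL (`quality_api_url` is a hypothetical name):

```python
def quality_api_url(
    owner: str,
    repo: str,
    base: str = "https://pt-edge.onrender.com/api/v1/quality/nlp",
) -> str:
    """Build the per-repo quality endpoint URL shown in the curl example."""
    return f"{base}/{owner}/{repo}"

# Fetching would then be e.g. (requires the `requests` package):
#   requests.get(quality_api_url("PrithivirajDamodaran", "Alt-ZSC")).json()
```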
Higher-rated alternatives
airaria/TextBrewer
A PyTorch-based knowledge distillation toolkit for natural language processing
sunyilgdx/NSP-BERT
The code for our paper "NSP-BERT: A Prompt-based Zero-Shot Learner Through an Original...
kssteven418/LTP
[KDD'22] Learned Token Pruning for Transformers
princeton-nlp/CoFiPruning
[ACL 2022] Structured Pruning Learns Compact and Accurate Models https://arxiv.org/abs/2204.00408
georgian-io/Transformers-Domain-Adaptation
:no_entry: [DEPRECATED] Adapt Transformer-based language models to new text domains