zjunlp/Revisit-KNN
[CCL 2023] Revisiting k-NN for Fine-tuning Pre-trained Language Models
This project helps machine learning engineers and researchers improve how well their language models classify text. It takes a pre-trained language model and textual data, then applies a k-Nearest Neighbors (k-NN) approach during training. The outcome is a more accurate and robust text classification model, particularly useful for tasks like sentiment analysis, question answering, and information extraction, even with limited data.
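The general kNN-augmentation recipe behind approaches like this can be sketched as follows: blend the fine-tuned model's class probabilities with a vote over the labels of the query's nearest training embeddings. This is a minimal illustrative sketch, not the paper's exact formulation; the function name, the interpolation weight `lam`, and the toy embeddings are all assumptions.

```python
import numpy as np

def knn_interpolated_predict(query_emb, train_embs, train_labels,
                             model_probs, k=3, lam=0.5, n_classes=2):
    """Blend a model's class probabilities with a k-NN vote over
    training-example embeddings (a common kNN-augmentation scheme)."""
    # Euclidean distance from the query to every training embedding
    dists = np.linalg.norm(train_embs - query_emb, axis=1)
    nearest = np.argsort(dists)[:k]
    # Turn the k neighbor labels into a class distribution
    knn_probs = np.bincount(train_labels[nearest], minlength=n_classes) / k
    # lam weights the parametric model, (1 - lam) the k-NN vote
    return lam * model_probs + (1 - lam) * knn_probs

# Toy demo: two well-separated clusters of 2-dim "embeddings"
train_embs = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [0.1, 0.1],
                       [5.0, 5.0], [5.1, 5.0], [5.0, 5.1], [5.1, 5.1]])
train_labels = np.array([0, 0, 0, 0, 1, 1, 1, 1])
query = np.array([0.05, 0.05])        # lies inside the class-0 cluster
model_probs = np.array([0.5, 0.5])    # an uncertain model prediction
probs = knn_interpolated_predict(query, train_embs, train_labels, model_probs)
# The k-NN vote pulls the blended prediction toward class 0
```

With limited training data, the non-parametric k-NN term can correct an under-trained classifier head, which is the intuition behind revisiting k-NN for fine-tuning.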
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher looking to enhance the accuracy and robustness of your pre-trained language models for various text classification and information extraction tasks, especially when working with limited training data.
Not ideal if you are a business user without machine learning experience or if you need a plug-and-play solution without fine-tuning language models.
Stars: 10
Forks: 1
Language: Python
License: MIT
Category: NLP
Last pushed: Jun 18, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/zjunlp/Revisit-KNN"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
Higher-rated alternatives
n-waves/multifit
The code to reproduce results from paper "MultiFiT: Efficient Multi-lingual Language Model...
princeton-nlp/SimCSE
[EMNLP 2021] SimCSE: Simple Contrastive Learning of Sentence Embeddings https://arxiv.org/abs/2104.08821
yxuansu/SimCTG
[NeurIPS'22 Spotlight] A Contrastive Framework for Neural Text Generation
alibaba-edu/simple-effective-text-matching
Source code of the ACL2019 paper "Simple and Effective Text Matching with Richer Alignment Features".
Shark-NLP/OpenICL
OpenICL is an open-source framework to facilitate research, development, and prototyping of...