zjunlp/Revisit-KNN

[CCL 2023] Revisiting k-NN for Fine-tuning Pre-trained Language Models

Quality score: 28 / 100 (Experimental)

This project helps machine learning engineers and researchers improve how well their language models classify text. It takes a pre-trained language model and textual data, then applies a k-Nearest Neighbors (k-NN) approach during training. The outcome is a more accurate and robust text classification model, particularly useful for tasks like sentiment analysis, question answering, and information extraction, even with limited data.
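The core idea above can be sketched in a few lines: embed each text with the pre-trained model, then classify a query by majority vote over its k nearest training embeddings. This is a minimal, self-contained illustration of k-NN classification over embeddings, not the repo's actual training procedure; `knn_predict` and the 2-D toy vectors are hypothetical stand-ins for real PLM [CLS] embeddings.

```python
import numpy as np

def knn_predict(query_emb, train_embs, train_labels, k=3):
    """Classify a query embedding by majority vote over its k nearest
    training embeddings, using cosine similarity.

    Hypothetical helper: the repository's method of combining k-NN
    signals with model fine-tuning may differ from this sketch.
    """
    # Normalize so a dot product equals cosine similarity
    q = query_emb / np.linalg.norm(query_emb)
    t = train_embs / np.linalg.norm(train_embs, axis=1, keepdims=True)
    sims = t @ q                    # cosine similarity to each training example
    top_k = np.argsort(-sims)[:k]   # indices of the k most similar examples
    votes = np.bincount(np.asarray(train_labels)[top_k])
    return int(np.argmax(votes))

# Toy demo: 2-D vectors standing in for PLM sentence embeddings
train = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
labels = [0, 0, 1, 1]
pred = knn_predict(np.array([0.95, 0.05]), train, labels, k=3)
print(pred)  # 0 — the two nearest neighbors (and two of the top three) have label 0
```

In the few-shot settings the project targets, this kind of non-parametric vote can be interpolated with the model's own logits, which is one common way to make predictions more robust with limited data.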

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher looking to enhance the accuracy and robustness of your pre-trained language models for various text classification and information extraction tasks, especially when working with limited training data.

Not ideal if you are a business user without machine learning experience or if you need a plug-and-play solution without fine-tuning language models.

text-classification natural-language-processing information-extraction language-model-fine-tuning machine-learning-research
Flags: Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 16 / 25
Community: 7 / 25


Stars: 10
Forks: 1
Language: Python
License: MIT
Last pushed: Jun 18, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/zjunlp/Revisit-KNN"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.