sunyilgdx/Prompts4Keras
Prompt-learning methods built on BERT4Keras (PET, EFL, and NSP-BERT), for both Chinese and English.
This project helps machine learning engineers and researchers quickly set up and compare prompt-learning methods for text classification, especially in few-shot settings where very little labeled data is available. It takes pre-trained BERT or RoBERTa models and your text datasets (English or Chinese) and applies techniques such as PET, EFL, and NSP-BERT to classify the text. This is useful for NLP practitioners who need to efficiently evaluate different prompt-learning approaches.
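To make the three methods concrete, here is a minimal sketch of how each one reformulates a classification example before it reaches a pre-trained model: PET turns it into a cloze (fill-in-the-mask) prompt, EFL into entailment-style sentence pairs, and NSP-BERT into next-sentence-prediction pairs. The template wording and label words below are illustrative assumptions, not the repository's actual prompts.

```python
# Illustrative verbalizer: maps each class label to a label word (assumed).
LABEL_WORDS = {"sports": "sports", "business": "business"}

def pet_input(text, mask_token="[MASK]"):
    """PET: cloze-style prompt; an MLM head predicts the label word at the mask."""
    return f"{text} This article is about {mask_token}."

def efl_inputs(text):
    """EFL: one (premise, hypothesis) pair per candidate label;
    the model scores entailment and the highest-scoring label wins."""
    return [(text, f"This article is about {label}.") for label in LABEL_WORDS]

def nsp_bert_inputs(text):
    """NSP-BERT: one (label sentence, text) pair per candidate label,
    scored with BERT's next-sentence-prediction head."""
    return [(f"This article is about {label}.", text) for label in LABEL_WORDS]

example = "The team won the championship last night."
print(pet_input(example))
for pair in efl_inputs(example):
    print(pair)
```

The key design difference: PET needs only the masked-language-model head, while EFL and NSP-BERT turn an N-way classification into N binary scoring calls, which trades inference cost for reusing sentence-pair pre-training objectives.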
No commits in the last 6 months.
Use this if you are an NLP researcher or machine learning engineer focused on text classification with limited data, and you want to experiment with or benchmark prompt-learning methods on both English and Chinese text.
Not ideal if you are looking for a simple, out-of-the-box solution for large-scale text classification tasks without an interest in comparative prompt-learning research, or if you prefer frameworks other than TensorFlow/Keras.
Stars
30
Forks
—
Language
Python
License
Apache-2.0
Category
Last pushed
Oct 12, 2022
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/sunyilgdx/Prompts4Keras"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
codertimo/BERT-pytorch
PyTorch implementation of Google AI's 2018 BERT
JayYip/m3tl
BERT for Multitask Learning
920232796/bert_seq2seq
PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; now also supports automatic summarization, text classification, sentiment analysis, NER, POS tagging, and more, plus the T5 model and GPT2 for text continuation.
sileod/tasknet
Easy modernBERT fine-tuning and multi-task learning
graykode/toeicbert
TOEIC(Test of English for International Communication) solving using pytorch-pretrained-BERT model.