JosselinSomervilleRoberts/BERT-Multitask-learning
Multitask learning on a BERT backbone. Makes it easy to train a single BERT model with state-of-the-art methods such as PCGrad, Gradient Vaccine, PALs, task scheduling, class-imbalance handling, and many other optimizations.
This project helps machine learning practitioners efficiently train a single BERT model to perform multiple text analysis tasks simultaneously, such as classifying sentiment, detecting paraphrases, and measuring semantic similarity. It takes raw text data as input and outputs predictions for these tasks, alongside training logs and model checkpoints. This is designed for ML engineers or data scientists working with natural language processing tasks who need to optimize model training.
No commits in the last 6 months.
Use this if you are developing NLP applications and need to train a single BERT-based model for multiple, related text understanding tasks efficiently and with state-of-the-art training techniques.
Not ideal if you are looking for a ready-to-use API for sentiment analysis, paraphrase detection, or semantic similarity without delving into model training and optimization.
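One of the techniques the repository lists, PCGrad, resolves conflicts between per-task gradients by projecting each task's gradient onto the normal plane of any other task gradient it conflicts with (negative dot product) before summing. A minimal NumPy sketch of that projection rule follows; it is an illustration of the published PCGrad update, not this repository's implementation, and it omits the random task ordering the original algorithm uses.

```python
import numpy as np

def pcgrad(grads):
    """Combine per-task gradients with PCGrad-style conflict projection.

    For each task gradient g_i, subtract its component along any other
    task gradient g_j it conflicts with (g_i . g_j < 0), then sum the
    projected gradients into a single update direction.
    Deterministic sketch: the original PCGrad shuffles task order.
    """
    projected = [g.astype(float).copy() for g in grads]
    for i, g_i in enumerate(projected):
        for j, g_j in enumerate(grads):
            if i == j:
                continue
            dot = float(g_i @ g_j)
            if dot < 0:  # conflicting gradients: remove the opposing component
                g_i -= dot / float(g_j @ g_j) * g_j
    return np.sum(projected, axis=0)

# Two conflicting task gradients: each is projected away from the other.
update = pcgrad([np.array([1.0, 0.0]), np.array([-1.0, 1.0])])
```

When gradients do not conflict (all pairwise dot products are non-negative), the function reduces to a plain sum, which is the standard multitask gradient.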
Stars: 20
Forks: 3
Language: Python
License: Apache-2.0
Category:
Last pushed: Oct 08, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/JosselinSomervilleRoberts/BERT-Multitask-learning"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
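The same endpoint shown in the curl command can be called from Python with the standard library. A small sketch follows; the `quality_url` and `fetch_quality` helpers are hypothetical names, and the JSON response schema is not documented here, so the code only assumes the endpoint returns JSON.

```python
import json
import urllib.request

# Base of the pt-edge quality API, taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/nlp"

def quality_url(owner, repo):
    """Build the quality-endpoint URL for a GitHub owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner, repo, timeout=10):
    """Fetch and decode the quality record (assumed to be JSON)."""
    with urllib.request.urlopen(quality_url(owner, repo), timeout=timeout) as resp:
        return json.load(resp)
```

Usage mirrors the curl example: `fetch_quality("JosselinSomervilleRoberts", "BERT-Multitask-learning")`, subject to the 100 requests/day anonymous limit.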
Higher-rated alternatives
codertimo/BERT-pytorch
Google AI 2018 BERT pytorch implementation
JayYip/m3tl
BERT for Multitask Learning
920232796/bert_seq2seq
A PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; now also supports automatic summarization, text classification, sentiment analysis, NER, POS tagging, and more; supports the T5 model and article continuation with GPT-2.
sileod/tasknet
Easy ModernBERT fine-tuning and multi-task learning
graykode/toeicbert
TOEIC (Test of English for International Communication) question solving using the pytorch-pretrained-BERT model.