JosselinSomervilleRoberts/BERT-Multitask-learning

Multitask learning with a BERT backbone. Lets you easily train a BERT model with state-of-the-art methods such as PCGrad, Gradient Vaccine, PALs, task scheduling, class-imbalance handling, and many other optimizations.

Score: 34 / 100 (Emerging)

This project helps machine learning practitioners efficiently train a single BERT model to perform multiple text analysis tasks simultaneously, such as classifying sentiment, detecting paraphrases, and measuring semantic similarity. It takes raw text data as input and outputs predictions for these tasks, alongside training logs and model checkpoints. This is designed for ML engineers or data scientists working with natural language processing tasks who need to optimize model training.
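For orientation, the gradient-surgery method named above (PCGrad) resolves conflicts between per-task gradients by projecting each gradient onto the normal plane of any other task gradient it conflicts with. The NumPy sketch below illustrates only that projection step; it is a simplified rendering of the published method, not code from this repository:

# Minimal sketch of the PCGrad idea: when two task gradients conflict
# (negative dot product), project one onto the normal plane of the other
# before combining. Illustrative only -- not code from this repository.
import numpy as np

def pcgrad(task_grads):
    """Combine per-task gradients, projecting away pairwise conflicts."""
    projected = [g.astype(float).copy() for g in task_grads]
    for i, g_i in enumerate(projected):
        for j, g_j in enumerate(task_grads):
            if i == j:
                continue
            dot = np.dot(g_i, g_j)
            if dot < 0:  # conflicting directions: remove the component along g_j
                g_i -= dot / np.dot(g_j, g_j) * g_j
    return np.sum(projected, axis=0)

# Example: two gradients that conflict along the first axis
g1 = np.array([1.0, 0.5])
g2 = np.array([-1.0, 0.5])
print(pcgrad([g1, g2]))  # -> [0.0, 1.6]; the conflicting component cancels

The paper also shuffles the order in which other tasks' gradients are visited at each step; the sketch omits that for brevity.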

No commits in the last 6 months.

Use this if you are developing NLP applications and need to efficiently train a single BERT-based model on multiple related text-understanding tasks using state-of-the-art training techniques.

Not ideal if you are looking for a ready-to-use API for sentiment analysis, paraphrase detection, or semantic similarity without delving into model training and optimization.

Natural Language Processing · Machine Learning Engineering · Text Analytics · Sentiment Analysis · Semantic Similarity
Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 12 / 25


Stars: 20
Forks: 3
Language: Python
License: Apache-2.0
Last pushed: Oct 08, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/JosselinSomervilleRoberts/BERT-Multitask-learning"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
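If you prefer Python over curl, a minimal sketch of the same request is below. Only the URL comes from this page; the use of requests, the timeout value, and printing the raw JSON are illustrative assumptions, since the response schema is not documented here:

# Hypothetical usage sketch: fetching the same quality data from Python.
# Assumes the endpoint returns JSON; no particular field names are assumed.
import requests

url = ("https://pt-edge.onrender.com/api/v1/quality/nlp/"
       "JosselinSomervilleRoberts/BERT-Multitask-learning")
resp = requests.get(url, timeout=10)
resp.raise_for_status()
print(resp.json())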