avacaondata/nlpboost
Python library for automatic training, optimization, and comparison of Transformer models for most NLP tasks.
This tool helps NLP practitioners efficiently train and compare multiple Transformer models across various datasets. You provide your text data and desired models, and it automates the hyperparameter tuning and evaluation process. The output includes performance metrics and comparison plots, helping data scientists and NLP researchers quickly identify the best model for their specific task.
No commits in the last 6 months. Available on PyPI.
Use this if you need to rapidly experiment with different Transformer models and datasets for NLP tasks like classification or named entity recognition, without spending extensive time on manual coding and configuration.
Not ideal if you prefer to hand-code every aspect of your model training and hyperparameter search, or if your NLP task falls outside common categories like classification, NER, or seq2seq.
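The train-many-models-and-compare loop described above can be sketched generically. This is not nlpboost's actual API; the model names, learning-rate grid, and scoring function below are hypothetical stand-ins so the sketch runs without GPUs or downloads:

```python
import itertools

# Hypothetical stand-in for "fine-tune a Transformer and return its eval metric".
# A real run would train on your dataset and report e.g. F1 or accuracy.
def train_and_evaluate(model_name: str, learning_rate: float) -> float:
    # Deterministic dummy scores so the sketch is runnable anywhere.
    base = {"bert-base-uncased": 0.80, "distilbert-base-uncased": 0.76}[model_name]
    # Pretend a mid-range learning rate works best for both models.
    penalty = abs(learning_rate - 3e-5) * 1000
    return base - penalty

def compare_models(models, learning_rates):
    """Grid-search every (model, lr) pair and return results sorted best-first."""
    results = [
        {"model": m, "lr": lr, "metric": train_and_evaluate(m, lr)}
        for m, lr in itertools.product(models, learning_rates)
    ]
    return sorted(results, key=lambda r: r["metric"], reverse=True)

leaderboard = compare_models(
    ["bert-base-uncased", "distilbert-base-uncased"],
    [1e-5, 3e-5, 5e-5],
)
best = leaderboard[0]  # the winning (model, lr) combination
```

Tools like nlpboost replace the toy grid with a real hyperparameter search and the dummy scorer with actual fine-tuning runs, then emit the leaderboard as metrics and plots.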
Stars: 20
Forks: 3
Language: Python
License: MIT
Category:
Last pushed: May 06, 2023
Commits (30d): 0
Dependencies: 31
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/avacaondata/nlpboost"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
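The curl call above can be reproduced in Python. Only the endpoint path comes from this page; the response schema is not documented here, so the JSON is returned as-is rather than assuming any field names:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL (path taken from the curl example above)."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch quality data as a dict. Anonymous access: 100 requests/day."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

url = quality_url("nlp", "avacaondata", "nlpboost")
# data = fetch_quality("nlp", "avacaondata", "nlpboost")  # needs network access
```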
Higher-rated alternatives
farach/huggingfaceR
Hugging Face state-of-the-art models in R
DengBoCong/nlp-paper
Papers in the natural language processing field (with reading notes), plus model reproductions and data processing (code available in both TensorFlow and PyTorch versions)
xiangking/ark-nlp
A private NLP toolkit for quickly implementing SOTA solutions.
IDEA-CCNL/GTS-Engine
GTS Engine: A powerful NLU Training...
adapter-hub/Hub
ARCHIVED. Please use https://docs.adapterhub.ml/huggingface_hub.html || 🔌 A central repository...