ManashJKonwar/NLP-Transformers
Transformer (BERT, GPT2, etc.) based Training Module for popular NLP tasks
This project helps machine learning engineers and data scientists build custom natural language processing (NLP) models. You provide text data in a specified format, and it fine-tunes transformer models such as BERT or GPT2 for tasks like named entity recognition, text classification, or document summarization. The output is a trained model ready to deploy in your applications.
No commits in the last 6 months.
Use this if you need to create specialized AI models that understand and process human language for a variety of tasks, and you are comfortable working with Python and PyTorch.
Not ideal if you're looking for an out-of-the-box solution with a graphical interface, or if you don't have experience with machine learning model training.
Stars
9
Forks
2
Language
Python
License
MIT
Category
Last pushed
Nov 20, 2022
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ManashJKonwar/NLP-Transformers"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
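If you prefer to call the endpoint from code rather than curl, a minimal sketch follows. The URL pattern is taken from the curl example above; the JSON response schema is not documented here, so inspect the returned payload for actual field names before relying on them.

```python
# Sketch: build the pt-edge quality-API URL for a repository and fetch it.
# URL pattern comes from the curl example above; response fields are an
# assumption -- print the JSON to see what the API actually returns.
import json
from urllib.request import urlopen

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the API URL for a repo, following the documented pattern."""
    return f"https://pt-edge.onrender.com/api/v1/quality/{category}/{owner}/{repo}"

url = quality_url("transformers", "ManashJKonwar", "NLP-Transformers")
# data = json.load(urlopen(url))  # uncomment to fetch (100 requests/day without a key)
# print(json.dumps(data, indent=2))
```

The network call is left commented out so the snippet can be read and adapted without consuming your daily request quota.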
Higher-rated alternatives
LoicGrobol/zeldarose
Train transformer-based models.
CPJKU/wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of...
yuanzhoulvpi2017/zero_nlp
Chinese NLP solutions (large models, data, models, training, inference)
minggnim/nlp-models
A repository for training transformer based models
IntelLabs/nlp-architect
A model library for exploring state-of-the-art deep learning topologies and techniques for...