LoicGrobol/zeldarose
Train transformer-based models.
This tool helps developers train custom transformer-based language models efficiently. You provide raw text files, one sentence per line, and it trains both a tokenizer and a transformer model. The output is a trained model and tokenizer that can then be used for downstream natural language processing tasks. It is designed for machine learning engineers and researchers working with large text datasets.
Available on PyPI.
Use this if you need a straightforward way to train transformer models, especially for masked language modeling, using existing frameworks like Hugging Face's transformers.
Not ideal if you are a beginner looking for a no-code solution or if your primary goal is fine-tuning an existing model without custom pre-training.
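The masked language modeling objective mentioned above works by hiding a fraction of input tokens and training the model to recover them. The sketch below illustrates the BERT-style corruption step only; it is a minimal stdlib illustration of the idea, not zeldarose's implementation (the mask symbol, 15% rate, and function names here are illustrative assumptions).

```python
import random

MASK = "[MASK]"      # placeholder symbol, as in BERT-style MLM
MASK_PROB = 0.15     # commonly used masking rate; an assumption here

def mask_tokens(tokens, rng):
    """Randomly replace ~15% of tokens with a mask symbol, returning the
    corrupted sequence and the prediction targets (None = no loss here)."""
    corrupted, targets = [], []
    for tok in tokens:
        if rng.random() < MASK_PROB:
            corrupted.append(MASK)
            targets.append(tok)    # the model must recover this token
        else:
            corrupted.append(tok)
            targets.append(None)   # position excluded from the loss
    return corrupted, targets

rng = random.Random(1)  # fixed seed so the example is reproducible
corrupted, targets = mask_tokens("the cat sat on the mat".split(), rng)
```

During training, the model sees `corrupted` as input and is penalized only at the masked positions, which is what lets it learn from raw, unlabeled sentences.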
Stars
28
Forks
3
Language
Python
License
—
Category
—
Last pushed
Jan 23, 2026
Commits (30d)
0
Dependencies
18
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/LoicGrobol/zeldarose"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
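The curl call above can also be made from Python with the standard library alone. This is a minimal sketch: the URL path segments mirror the example, but the response schema and the `X-API-Key` header name are assumptions, not documented facts about the service.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem, owner, repo):
    """Build the endpoint URL following the curl example above."""
    return f"{API_BASE}/{ecosystem}/{owner}/{repo}"

def fetch_quality(ecosystem, owner, repo, api_key=None):
    """Fetch repo quality data as a dict. Passing an API key (header name
    is an assumption) would raise the daily request limit."""
    req = urllib.request.Request(quality_url(ecosystem, owner, repo))
    if api_key:
        req.add_header("X-API-Key", api_key)  # assumed header name
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

url = quality_url("transformers", "LoicGrobol", "zeldarose")
```

Calling `fetch_quality("transformers", "LoicGrobol", "zeldarose")` performs the same request as the curl example, subject to the 100-requests/day limit.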
Related models
CPJKU/wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of...
yuanzhoulvpi2017/zero_nlp
Chinese NLP solutions (large models, data, models, training, inference)
minggnim/nlp-models
A repository for training transformer based models
IntelLabs/nlp-architect
A model library for exploring state-of-the-art deep learning topologies and techniques for...
MahmoudWahdan/dialog-nlu
Tensorflow and Keras implementation of the state of the art researches in Dialog System NLU