Beomi/easy-lm-trainer

πŸ€— μ΅œμ†Œν•œμ˜ μ„ΈνŒ…μœΌλ‘œ LM을 ν•™μŠ΅ν•˜κΈ° μœ„ν•œ μƒ˜ν”Œμ½”λ“œ

Score: 28 / 100 (Experimental)

This project helps machine learning engineers and researchers quickly set up and train large language models using the Hugging Face Transformers library. You provide a dataset, and it outputs a trained language model ready for downstream tasks like text generation or summarization. It's designed for those who want to get a basic causal language model (CLM) training run operational with minimal initial configuration.
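For orientation, a causal-LM training run of this shape with Hugging Face Transformers looks roughly like the sketch below. This is an illustrative sketch using the stock Trainer API, not this repo's actual script; the checkpoint ("gpt2"), dataset (wikitext-2), and hyperparameters are placeholder assumptions.

# Minimal CLM training sketch with Hugging Face Transformers.
# Model, dataset, and hyperparameters are illustrative assumptions,
# not this repo's actual defaults.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "gpt2"  # assumption: any causal-LM checkpoint fits here
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Load and tokenize a small public text corpus, dropping empty lines.
dataset = load_dataset("wikitext", "wikitext-2-raw-v1", split="train")
dataset = dataset.filter(lambda x: x["text"].strip())

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

# mlm=False makes the collator copy input_ids into labels for causal-LM loss.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="clm-out",
    per_device_train_batch_size=4,
    num_train_epochs=1,
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
trainer.save_model("clm-out/final")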

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher looking for a straightforward boilerplate to start training a causal language model with Hugging Face Transformers.

Not ideal if you need to fine-tune a model for a highly specialized task requiring complex custom training loops or advanced distributed training strategies beyond what DeepSpeed offers out of the box.
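For context on "out of the box": the stock Transformers integration enables DeepSpeed by handing a config to TrainingArguments, either as a dict or as a path to a JSON file. A minimal sketch, assuming ZeRO stage 2 with "auto" sizing (the values are illustrative, not this repo's settings):

# Hedged sketch: DeepSpeed ZeRO via the stock Transformers integration.
from transformers import TrainingArguments

ds_config = {
    # ZeRO stage 2 shards optimizer state and gradients across GPUs.
    "zero_optimization": {"stage": 2},
    # "auto" lets Transformers fill these in from TrainingArguments.
    "train_micro_batch_size_per_gpu": "auto",
    "gradient_accumulation_steps": "auto",
}

args = TrainingArguments(
    output_dir="clm-out",
    per_device_train_batch_size=4,
    deepspeed=ds_config,  # accepts a dict or a path to a JSON config file
)

A script built on these arguments is then started with a distributed launcher (e.g. deepspeed train.py) rather than plain python.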

Tags: natural-language-processing · large-language-models · model-training · machine-learning-engineering
No License · Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 8 / 25
Maturity: 8 / 25
Community: 12 / 25

Stars: 59
Forks: 7
Language: Python
License: None
Last pushed: May 23, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Beomi/easy-lm-trainer"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
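The same endpoint works from Python; a minimal sketch, assuming the response body is JSON (the payload schema is not documented here):

# Hedged sketch: fetch the quality data from Python with requests.
import requests

url = "https://pt-edge.onrender.com/api/v1/quality/transformers/Beomi/easy-lm-trainer"
resp = requests.get(url, timeout=10)
resp.raise_for_status()  # surface HTTP errors (e.g. rate limiting) early
print(resp.json())  # inspect the payload; exact fields are not specified above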