gyunggyung/LFM2-KoEn-Tuning
Fine-tuning LFM2-1.2B for Korean-English bidirectional translation. Trained with GRPO+COMET and SFT; outperforms 4B models.
Overall quality score: 37 / 100 (Emerging)
No package published; no dependents.

Score breakdown (four components, each out of 25, summing to the overall 37):
Maintenance: 6 / 25
Adoption: 4 / 25
Maturity: 13 / 25
Community: 14 / 25
Stars: 7
Forks: 3
Language: Jupyter Notebook
License: Apache-2.0
Category: transformers
Last pushed: Jan 04, 2026
Commits (last 30 days): 0
Get this data via API:
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/gyunggyung/LFM2-KoEn-Tuning"
The endpoint is open to everyone at 100 requests/day with no API key; a free key raises the limit to 1,000 requests/day.
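The same report can be fetched programmatically. Below is a minimal Python sketch using the requests library; it assumes the endpoint returns a JSON object (the response schema is not documented on this page, so fields are printed exactly as returned):

import requests

# Quality-report endpoint for this repository (same URL as the curl example above).
URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/gyunggyung/LFM2-KoEn-Tuning"

def fetch_quality_report(url: str = URL) -> dict:
    """Fetch the quality report; assumes a JSON object body (schema undocumented here)."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()  # anonymous access is limited to 100 requests/day
    return resp.json()

if __name__ == "__main__":
    report = fetch_quality_report()
    for key, value in report.items():  # field names are whatever the API returns
        print(f"{key}: {value}")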
Higher-rated alternatives
affjljoo3581/GPT2 (score 48): PyTorch implementation of OpenAI GPT-2.
akanyaani/Illustrated_GPT2_With_Code (score 29): The GPT-2 Transformer model explained step by step, with code.
b14ucky/Taco-LLMingway (score 19): Custom GPT Transformer architecture built from scratch in PyTorch. Trained on Taco Hemingway's...
dheeren-tejani/mini-lm-124m (score 11): Experimental GPT-2-scale (~124M-parameter) LLM trained from scratch on Google Colab. Trained on C4,...