wxhcore/bumblecore
An LLM training framework built from the ground up, featuring a custom BumbleBee architecture and end-to-end support for multiple open-source models across Pretraining → SFT → RLHF/DPO.
This framework helps deep learning researchers and algorithm engineers build and train large language models (LLMs) from scratch. It takes raw text or instruction-formatted datasets and produces a custom-trained LLM ready for specific applications, giving the user complete control over every aspect of the training process. Learners who want to understand LLM internals will also find it useful.
Use this if you need fine-grained control over the entire large language model training pipeline, from architecture design to specific loss functions, and want to deeply customize existing models or create new ones.
Not ideal if you prefer high-level APIs or pre-built solutions for quick, off-the-shelf LLM fine-tuning without diving into the underlying implementation details.
Stars
63
Forks
13
Language
Python
License
Apache-2.0
Category
Last pushed
Feb 09, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/wxhcore/bumblecore"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
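The curl example above can also be wrapped in a few lines of Python. This is a minimal sketch: the URL pattern (with `transformers` as a category path segment) is inferred from the single example shown, and the shape of the JSON response is not documented here, so inspect the actual payload before relying on specific fields.

```python
import json
import urllib.request

# Base path inferred from the curl example above; may differ for other categories.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(owner: str, repo: str, category: str = "transformers") -> str:
    """Build the quality-API URL for a given repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """GET the quality record as parsed JSON.

    No API key is needed for up to 100 requests/day; the response
    schema is an assumption and should be verified against a real call.
    """
    with urllib.request.urlopen(quality_url(owner, repo), timeout=10) as resp:
        return json.load(resp)
```

For example, `fetch_quality("wxhcore", "bumblecore")` issues the same request as the curl command above.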
Higher-rated alternatives
NX-AI/xlstm
Official repository of the xLSTM.
sinanuozdemir/oreilly-hands-on-gpt-llm
Mastering the Art of Scalable and Efficient AI Model Deployment
DashyDashOrg/pandas-llm
Pandas-LLM
MiniMax-AI/MiniMax-01
The official repo of MiniMax-Text-01 and MiniMax-VL-01, large-language-model &...
verifai/multiLLM
🚀 Invoke multiple large language models concurrently and rank the results. Add new models and...