# Code Model Training: Transformer Models
This page tracks 16 code model training projects. The highest-rated is oripress/AlgoTune, scoring 49/100 with 95 stars.
To get all 16 projects as JSON:

```shell
curl "https://pt-edge.onrender.com/api/v1/datasets/quality?domain=transformers&subcategory=code-model-training&limit=20"
```
The API is open to everyone: 100 requests/day with no key, or 1,000/day with a free key.
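The endpoint takes three query parameters: `domain`, `subcategory`, and `limit`. A minimal Python sketch, using only the standard library, that assembles the same URL as the curl example so the query can be varied programmatically (the `build_url` helper name is ours, not part of the API):

```python
from urllib.parse import urlencode

BASE_URL = "https://pt-edge.onrender.com/api/v1/datasets/quality"

def build_url(domain: str, subcategory: str, limit: int = 20) -> str:
    """Assemble a quality-dataset query URL from its documented parameters."""
    params = {"domain": domain, "subcategory": subcategory, "limit": limit}
    # urlencode percent-escapes values and joins them with '&'
    return f"{BASE_URL}?{urlencode(params)}"

# Reproduces the curl example above
url = build_url("transformers", "code-model-training")
print(url)
```

Pass the result to any HTTP client; the response is the project list as JSON.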
| # | Model | Score | Tier |
|---|---|---|---|
| 1 | **oripress/AlgoTune**<br>AlgoTune is a NeurIPS 2025 benchmark made up of 154 math, physics, and... | | Emerging |
| 2 | **xjywhu/Awesome-Multimodal-LLM-for-Code**<br>Multimodal Large Language Models for Code Generation under Multimodal Scenarios | | Emerging |
| 3 | **jie-jw-wu/human-eval-comm**<br>HumanEvalComm: Evaluating Communication Skill of Code LLM and LLM Agent | | Emerging |
| 4 | **juyongjiang/CodeUp**<br>CodeUp: A Multilingual Code Generation Llama-X Model with... | | Emerging |
| 5 | **JHansiduYapa/Fine-Tuning-a-Small-Language-Model-for-Cypher-Query-Generation**<br>This project fine-tunes Unsloth's Gemma-3 4B IT (4-bit) model to translate... | | Emerging |
| 6 | **Gen-Verse/ReasonFlux**<br>[NeurIPS 2025 Spotlight] LLM post-training suite — featuring ReasonFlux,... | | Emerging |
| 7 | **martin-wey/cl-code-apis**<br>Replication package of the paper "On the Usage of Continual Learning for... | | Emerging |
| 8 | **skpig/MPSC**<br>[ACL 2024] Enhancing Large Language Models in Coding Through... | | Emerging |
| 9 | **xlang-ai/text2reward**<br>[ICLR 2024 Spotlight] Text2Reward: Reward Shaping with Language Models for... | | Experimental |
| 10 | **amazon-science/llm-code-preference**<br>Training and Benchmarking LLMs for Code Preference. | | Experimental |
| 11 | **sanskar9999/CodeEvolveLLM**<br>A framework for using local LLMs (Qwen2.5-coder 7B) that are fine-tuned... | | Experimental |
| 12 | **TingjiaInFuture/pixrep**<br>Let LLMs see your codebase just like you do. | | Experimental |
| 13 | **carlos-life/OpenEvolve**<br>Evolve algorithms with LLMs. Open-source AlphaEvolve alternative. Uses... | | Experimental |
| 14 | **PAN001/LeToRr**<br>LeToRr: Learning to Re-rank with Application in Code Generation | | Experimental |
| 15 | **Training-Datasmith/olmo3-code-150m-pretrain**<br>Pre-training a ~150M parameter code-specialized language model using OLMo 3... | | Experimental |
| 16 | **dakshjain-1616/nemotron3-super-vs-gpt5.4-nano**<br>Head-to-head benchmark comparing Nemotron and GPT-5.4-nano on code generation tasks | | Experimental |