MeryylleA/lunariscodex

A high-performance PyTorch toolkit for pre-training modern, Llama-style language models. Based on nanoGPT with significant architectural enhancements.

Overall score: 42 / 100 (Emerging)

This toolkit helps machine learning researchers and engineers efficiently train custom, high-performance large language models (LLMs) from scratch. You supply your own text datasets; it outputs a ready-to-use Llama-style language model with modern architectural features. It's aimed at practitioners building specialized models for domain-specific applications.

Use this if you need to pre-train a powerful, Llama-style language model on your specific dataset, demanding high performance and stability for large-scale training jobs.

Not ideal if you're looking to fine-tune an existing model, perform basic text analysis, or don't have extensive computational resources for training.

Large Language Models · Generative AI · Model Pre-training · Natural Language Processing · Deep Learning · Research
No Package · No Dependents
Maintenance 10 / 25
Adoption 5 / 25
Maturity 16 / 25
Community 11 / 25

How are scores calculated?
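The four category subscores above (each out of 25) appear to sum to the overall score of 42 / 100. A minimal sketch of that arithmetic, assuming the total really is a plain sum with no weighting:

```python
# Hypothetical reconstruction: overall score as the sum of the four
# category subscores shown above, each scored out of 25.
subscores = {
    "Maintenance": 10,
    "Adoption": 5,
    "Maturity": 16,
    "Community": 11,
}

overall = sum(subscores.values())  # four categories x 25 points = 100 max
print(overall)  # 42
```

If the service applies weights or rounding, the real formula may differ; this only shows that the published numbers are consistent with a simple sum.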

Stars: 13
Forks: 2
Language: Python
License: MIT
Last pushed: Jan 30, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/MeryylleA/lunariscodex"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
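The same endpoint can be called from Python instead of curl. This is a minimal sketch using only the standard library; the `nlp` path segment and repository slug are taken from the curl example above, and the `quality_url` helper is a name introduced here for illustration:

```python
# Sketch: building and fetching the quality-API URL from Python.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repository (helper name is ours)."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("nlp", "MeryylleA", "lunariscodex")
print(url)  # https://pt-edge.onrender.com/api/v1/quality/nlp/MeryylleA/lunariscodex

# Uncomment to actually fetch (100 requests/day without a key):
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)
```

How a free API key is supplied (header vs. query parameter) is not documented here, so the sketch stays with unauthenticated access.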