IST-DASLab/Quartet-II

Quartet II Official Code

Score: 30 / 100 (Emerging)

This project helps machine learning engineers and researchers optimize the pre-training of large language models. It provides tools and kernels for training these models in NVFP4, a more compute- and memory-efficient low-precision format, while maintaining accuracy. Given an existing large language model architecture and training data, it produces a model trained at lower cost.

Use this if you are a machine learning engineer or researcher focused on pre-training large language models and want to reduce computational costs and memory footprint without sacrificing model accuracy.

Not ideal if you are looking for a high-level API for using pre-trained models or for training smaller, non-LLM machine learning models.
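To make the NVFP4 idea concrete, here is a minimal sketch of block-scaled 4-bit float quantization. The block size of 16 and the E2M1 magnitude grid are assumptions about the format based on public descriptions of NVFP4; this illustrates the general technique only and is not the repo's kernels.

```python
# Simulate NVFP4-style quantization: E2M1 4-bit elements sharing a
# per-block scale. (Block size and grid are assumptions, not the
# project's actual implementation.)

E2M1_GRID = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]  # representable magnitudes

def quantize_block(block):
    """Quantize one block: pick a scale so the largest magnitude maps to
    6.0 (the top of the E2M1 grid), then round each element to the
    nearest representable value and rescale back."""
    amax = max(abs(x) for x in block)
    if amax == 0.0:
        return [0.0] * len(block)
    scale = amax / 6.0
    out = []
    for x in block:
        mag = min(E2M1_GRID, key=lambda g: abs(abs(x) / scale - g))
        out.append(mag * scale * (1.0 if x >= 0 else -1.0))
    return out

def quantize(values, block_size=16):
    """Quantize a flat list in independent blocks of `block_size`."""
    out = []
    for i in range(0, len(values), block_size):
        out.extend(quantize_block(values[i:i + block_size]))
    return out
```

Because each block carries its own scale, a block of small weights keeps its resolution instead of being flattened by a single tensor-wide scale, which is the main reason block-scaled FP4 formats can hold accuracy during training.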

large-language-model-training deep-learning-optimization AI-model-development computational-efficiency
No License · No Package · No Dependents
Maintenance: 10 / 25
Adoption: 8 / 25
Maturity: 3 / 25
Community: 9 / 25


Stars: 53
Forks: 4
Language: Python
License: none
Last pushed: Mar 01, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/IST-DASLab/Quartet-II"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.
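The same endpoint can be called from Python; a minimal sketch using only the standard library is below. Only the URL from the curl example above is taken from the page; the JSON response shape is not documented here, so the helper returns the parsed payload as-is.

```python
# Build and fetch the quality-score endpoint shown in the curl example.
# The response schema is unknown, so we just return the decoded JSON.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem, owner, repo):
    """Construct the endpoint URL for a given repo."""
    return f"{BASE}/{ecosystem}/{owner}/{repo}"

def fetch_quality(ecosystem, owner, repo, timeout=10):
    """GET the quality data for a repo and parse the JSON body."""
    url = quality_url(ecosystem, owner, repo)
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Usage: `fetch_quality("transformers", "IST-DASLab", "Quartet-II")` issues the same request as the curl command above.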