HenryNdubuaku/nanodl

Build GPT, Gemma, LLaMA, Mixtral, Whisper, Swin, ViT and more in JAX.

Score: 36 / 100 (Emerging)

This tool helps AI/ML experts and researchers quickly build and train smaller, custom versions of large transformer models for specific problems. You provide your specialized datasets (text, images, or audio), and it outputs a trained, custom AI model like a GPT variant, a vision transformer, or a Whisper-style audio model, ready for your unique tasks. It's designed for those who need to experiment with and deploy highly tailored AI solutions.

299 stars. No commits in the last 6 months.

Use this if you are an AI/ML expert or researcher looking to design, train, and experiment with custom transformer-based models from scratch, especially when working with specialized datasets or wanting a lightweight JAX-based framework.

Not ideal if you are a business user looking for a ready-to-use, off-the-shelf AI solution without deep technical engagement, or if you prefer frameworks other than JAX for model development.

AI-model-development natural-language-processing computer-vision audio-processing machine-learning-research
Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 10 / 25


Stars: 299
Forks: 12
Language: Python
License: MIT
Last pushed: Aug 28, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/HenryNdubuaku/nanodl"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.