jlippp/litesearch

Run autosearch on any NVIDIA GPU (works on 2–4 GB+ cards)

Score: 20 / 100 (Experimental)

This tool helps AI researchers and hobbyists train small to medium-sized language models efficiently on their personal NVIDIA GPUs. You provide the training data, and it automatically adjusts model size and training parameters to fit your GPU's memory. The result is a trained language model plus a log of experiments that helps you iterate toward a better model faster.
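To make the memory-fitting idea concrete, here is an illustrative heuristic (not litesearch's actual logic, which is not shown in this listing) for bounding the number of trainable parameters by available VRAM:

```python
def max_params_for_vram(vram_gb: float, bytes_per_param: int = 16,
                        activation_headroom: float = 0.5) -> int:
    """Rough upper bound on trainable parameters for a given VRAM size.

    Training with Adam in fp32 needs roughly 16 bytes per parameter
    (weights + gradients + two optimizer moments); the rest of the card
    is reserved as headroom for activations and framework overhead.
    """
    usable_bytes = vram_gb * (1 - activation_headroom) * 1024**3
    return int(usable_bytes // bytes_per_param)

# A 4 GB card leaves room for a model on the order of ~134M parameters.
print(max_params_for_vram(4.0))  # → 134217728
```

Under these assumptions, a 2–4 GB consumer card fits models in the tens-to-hundreds-of-millions-of-parameters range, which matches the "small to medium-sized" framing above.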

Use this if you want to autonomously experiment with and train custom language models on an NVIDIA consumer GPU, without needing vast cloud resources.

Not ideal if you need to train models on AMD or Intel GPUs, or if you are working with extremely large models that require enterprise-grade GPU clusters.

Tags: AI research, LLM training, model optimization, deep learning, GPU computing
No license, no package, no dependents
Maintenance 13 / 25
Adoption 6 / 25
Maturity 1 / 25
Community 0 / 25


Stars: 17
Forks:
Language: Python
License: none
Last pushed: Mar 22, 2026
Commits (30d): 0

Get this data via the API:

    curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/jlippp/litesearch"

Open to everyone: 100 requests/day with no key; a free key raises the limit to 1,000/day.
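For scripted access, the same endpoint can be called from Python. A minimal sketch, assuming the endpoint returns a JSON body (the response fields are not documented in this listing):

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    # Build the per-repo quality endpoint URL used by the curl example above.
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # Fetch and decode one quality record (assumes a JSON response body).
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

print(quality_url("jlippp", "litesearch"))
```

The helper names (`quality_url`, `fetch_quality`) are illustrative, not part of any published client library.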