Lallapallooza/gpt.rs
Rust LLM playground: build, train, generate on pluggable backends
gpt.rs lets machine learning practitioners and researchers build, train, and experiment with large language models (LLMs) and other deep learning models in Rust. It takes model weights and configurations as input and produces generated text or model predictions, with a focus on runtime performance. Its primary users are developers and researchers building or integrating AI models.
Use this if you are a developer or researcher who wants to optimize the runtime performance of deep learning models, especially large language models, through portable tensor programs and custom backend execution.
Not ideal if you want a high-level API or a stable, production-ready way to deploy pre-trained models without deep technical involvement.
Stars: 15
Forks: 1
Language: Rust
License: Apache-2.0
Category:
Last pushed: Feb 27, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/Lallapallooza/gpt.rs"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
trymirai/uzu: A high-performance inference engine for AI models
justrach/bhumi: ⚡ Bhumi – The fastest AI inference client for Python, built with Rust for unmatched speed,...
lipish/llm-connector: LLM Connector - A unified interface for connecting to various Large Language Model providers
keyvank/femtoGPT: Pure Rust implementation of a minimal Generative Pretrained Transformer
ShelbyJenkins/llm_client: The Easiest Rust Interface for Local LLMs and an Interface for Deterministic Signals from...