yybit/pllm
Portable LLM - A Rust library for LLM inference
pllm is a Rust library for embedding large language model (LLM) inference directly into applications. It loads pre-trained Llama2 and Gemma models in GGUF format and generates text from a given prompt. It is aimed at Rust developers who want to experiment with or build LLM features in their own projects.
No commits in the last 6 months.
Use this if you are a Rust developer looking to learn about or experiment with local LLM inference, particularly with Llama2 or Gemma models, and are comfortable with a library that is not yet production-ready.
Not ideal if you need a stable, production-grade solution for deploying LLMs, or if you are not a Rust developer.
Stars: 11
Forks: 1
Language: Rust
License: Apache-2.0
Category:
Last pushed: Apr 13, 2024
Monthly downloads: 6
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/yybit/pllm"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
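For programmatic use, the endpoint in the curl example can be wrapped in a small helper. The URL pattern below is taken directly from that example; the field names in the sample payload, however, are hypothetical placeholders, since the actual response schema is not documented here.

```python
import json
from urllib.parse import quote

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Build the quality-API URL for a repository,
    # following the pattern shown in the curl example above.
    return f"{API_BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

url = quality_url("llm-tools", "yybit", "pllm")
print(url)  # https://pt-edge.onrender.com/api/v1/quality/llm-tools/yybit/pllm

# Hypothetical response handling -- the field names below are assumptions,
# not the API's documented schema:
sample = json.loads('{"stars": 11, "forks": 1, "language": "Rust"}')
print(sample["stars"])
```

In a real client you would fetch `url` (e.g. with `urllib.request` or `requests`) and parse the JSON body; the sketch above keeps the payload local so it runs without network access.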
Higher-rated alternatives
trymirai/uzu: A high-performance inference engine for AI models
justrach/bhumi: ⚡ Bhumi – The fastest AI inference client for Python, built with Rust for unmatched speed,...
lipish/llm-connector: LLM Connector - A unified interface for connecting to various Large Language Model providers
keyvank/femtoGPT: Pure Rust implementation of a minimal Generative Pretrained Transformer
ShelbyJenkins/llm_client: The Easiest Rust Interface for Local LLMs and an Interface for Deterministic Signals from...