yybit/pllm

Portable LLM - A Rust library for LLM inference

Score: 30 / 100 (Emerging)

pllm is a Rust library that lets developers embed large language model (LLM) inference directly in their applications. It loads pre-trained models (specifically Llama2 and Gemma in GGUF format) and generates text from a given prompt. It is aimed at Rust developers who want to experiment with or build LLM features into their own projects.

No commits in the last 6 months.

Use this if you are a Rust developer looking to learn about or experiment with local LLM inference, particularly with Llama2 or Gemma models, and are comfortable with a library that is not yet production-ready.

Not ideal if you need a stable, production-grade solution for deploying LLMs, or if you are not a Rust developer.

Tags: LLM development, Rust programming, AI model integration, Text generation, Machine learning engineering
Flags: Stale (6m), No Package, No Dependents
Maintenance 0 / 25
Adoption 7 / 25
Maturity 16 / 25
Community 7 / 25


Stars: 11
Forks: 1
Language: Rust
License: Apache-2.0
Last pushed: Apr 13, 2024
Monthly downloads: 6
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/yybit/pllm"

Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
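The endpoint above appears to follow an owner/repo path scheme, so it can be scripted for other repositories on the same list. A minimal sketch; note that the `X-Api-Key` header name and the reuse of the path for other owner/repo pairs are assumptions, not documented here:

```shell
#!/bin/sh
# Build the quality API URL for a given repository on the llm-tools list.
# The path scheme (/api/v1/quality/llm-tools/<owner>/<repo>) is taken from
# the curl example above; applying it to other repos is an assumption.
owner="yybit"
repo="pllm"
url="https://pt-edge.onrender.com/api/v1/quality/llm-tools/${owner}/${repo}"
echo "$url"

# To fetch with a key for the higher rate limit (header name is a guess):
# curl -H "X-Api-Key: $API_KEY" "$url"
```

Swapping in a different `owner` and `repo` should, under that assumption, return the same kind of score breakdown for any other tool on the list.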