hurui200320/llama-cpp-kt
A Kotlin wrapper for llama.cpp, powered by JNA
This project helps Kotlin developers integrate llama.cpp, the high-performance C/C++ LLM inference library, into their JVM applications. It lets you use `llama.cpp`'s capabilities, such as loading and quantizing large language models, directly from Kotlin code. This is ideal for developers building backend services or desktop applications that require local LLM inference.
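To illustrate the kind of usage pattern such a wrapper enables, here is a minimal sketch in pure Kotlin. The interface, function names, and stub backend below are hypothetical, not llama-cpp-kt's actual API; in the real wrapper the backend would be a JNA `Library` interface bound to the native `llama.cpp` shared library, and the stub stands in for that native layer so the sketch is self-contained.

```kotlin
// Hypothetical sketch of a JNA-style Kotlin facade over llama.cpp.
// Names and signatures are illustrative assumptions, not the real API.

interface LlamaBackend {
    // In the real wrapper this would be a JNA Library interface whose
    // methods map to exported llama.cpp C functions.
    fun loadModel(path: String): Long      // returns an opaque native handle
    fun tokenize(handle: Long, text: String): IntArray
    fun free(handle: Long)
}

// Stub standing in for the JNA-bound native library, so this sketch runs on its own.
class StubBackend : LlamaBackend {
    override fun loadModel(path: String): Long =
        if (path.endsWith(".gguf")) 1L else 0L  // pretend only GGUF files load
    override fun tokenize(handle: Long, text: String): IntArray =
        text.split(" ").map { it.hashCode() }.toIntArray()  // fake token IDs
    override fun free(handle: Long) {
        // the real binding would release native memory here
    }
}

// Typical call pattern: load, use, and always free the native handle.
fun generateTokens(backend: LlamaBackend, modelPath: String, prompt: String): IntArray {
    val handle = backend.loadModel(modelPath)
    require(handle != 0L) { "failed to load model: $modelPath" }
    try {
        return backend.tokenize(handle, prompt)
    } finally {
        backend.free(handle)
    }
}
```

The try/finally around the native handle matters in any JNA-backed design: native allocations are invisible to the JVM garbage collector, so the wrapper must free them explicitly.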
No commits in the last 6 months.
Use this if you are a Kotlin developer who needs to embed large language model inference capabilities into your JVM-based applications, leveraging the efficiency of `llama.cpp`.
Not ideal if you are not a Kotlin developer or if you prefer using higher-level, pre-built LLM frameworks with extensive abstractions rather than low-level bindings.
Stars: 13
Forks: 1
Language: Kotlin
License: MIT
Category:
Last pushed: Aug 08, 2023
Commits (30d): 0
Get this data via API
```shell
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/hurui200320/llama-cpp-kt"
```
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
Higher-rated alternatives
- ludwig-ai/ludwig: Low-code framework for building custom LLMs, neural networks, and other AI models
- withcatai/node-llama-cpp: Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema...
- mudler/LocalAI: 🤖 The free, open-source alternative to OpenAI, Claude and others. Self-hosted and...
- zhudotexe/kani: kani (カニ) is a highly hackable microframework for tool-calling language models. (NLP-OSS @ EMNLP 2023)
- SciSharp/LLamaSharp: A C#/.NET library to run LLMs (🦙 LLaMA/LLaVA) on your local device efficiently.