mukel/llama3.java
Practical Llama 3 inference in Java
This project runs Meta's Llama 3, 3.1, and 3.2 large language models directly inside Java applications. It loads a pre-trained Llama 3 model file (in GGUF format) and lets you generate text or interact with the model in chat or instruct modes. It is aimed at Java developers who want to embed Llama 3 inference in their software without external dependencies.
800 stars. Actively maintained with 2 commits in the last 30 days.
Use this if you are a Java developer building an application that needs to embed Llama 3 text generation or conversational AI features directly.
Not ideal if you are not a Java developer or if you need to fine-tune Llama 3 models or integrate with other programming languages.
Stars: 800
Forks: 92
Language: Java
License: MIT
Category:
Last pushed: Feb 08, 2026
Commits (30d): 2
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/mukel/llama3.java"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.
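If you are calling this endpoint from Java rather than the shell, the same request can be made with the standard `java.net.http` client. This is a minimal sketch: `QualityApiClient` and `requestFor` are illustrative names, and the shape of the JSON response is not documented here, so the body is simply printed as a string.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustrative helper for the quality API shown in the curl example above.
public class QualityApiClient {
    static final String BASE =
            "https://pt-edge.onrender.com/api/v1/quality/transformers/";

    // Build a GET request for a given "owner/repo" slug.
    static HttpRequest requestFor(String repoSlug) {
        return HttpRequest.newBuilder()
                .uri(URI.create(BASE + repoSlug))
                .GET()
                .build();
    }

    public static void main(String[] args) throws Exception {
        HttpRequest req = requestFor("mukel/llama3.java");
        System.out.println(req.uri());
        // Uncomment to actually hit the API (100 requests/day without a key):
        // HttpClient client = HttpClient.newHttpClient();
        // HttpResponse<String> resp =
        //         client.send(req, HttpResponse.BodyHandlers.ofString());
        // System.out.println(resp.body());
    }
}
```

The network call is left commented out so the snippet compiles and runs without consuming your daily quota; uncomment it to fetch the live data.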
Related models
ludwig-ai/ludwig
Low-code framework for building custom LLMs, neural networks, and other AI models
withcatai/node-llama-cpp
Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema...
mudler/LocalAI
🤖 The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and...
zhudotexe/kani
kani (カニ) is a highly hackable microframework for tool-calling language models. (NLP-OSS @ EMNLP 2023)
SciSharp/LLamaSharp
A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.