gotzmann/llama.go

llama.go is like llama.cpp in pure Golang!

Quality score: 42 / 100 (Emerging)

This project lets developers run large language models (LLMs) such as LLaMA directly on personal computers or servers, without expensive GPU clusters. It takes existing LLaMA model files as input and generates text from user prompts. Software developers and engineers who want to embed LLM capabilities in their Go applications, or run local AI inference, will find it useful.

1,398 stars. No commits in the last 6 months.

Use this if you are a developer looking to integrate large language models into Go applications or to run local LLM inference efficiently on CPU-only machines.

Not ideal if you are looking for a plug-and-play, end-user application with a graphical interface, or if you primarily work with Python or other languages and don't require a Go-specific solution.

Topics: AI-inference, large-language-models, Go-development, natural-language-processing, edge-AI
Badges: Stale (6m), No Package, No Dependents
Maintenance 0 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 16 / 25


Stars: 1,398
Forks: 70
Language: Go
License: (not listed)
Last pushed: Sep 20, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/gotzmann/llama.go"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
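Since the endpoint returns JSON over plain HTTP, it is easy to consume from Go, which fits this Go-focused project. Below is a minimal sketch of a client-side decoder. The response schema is an assumption: the field names (`score`, `stage`, `stars`, `forks`) are guesses for illustration, not documented API fields, so check a real response before relying on them.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// qualityReport mirrors a plausible response shape for the quality
// endpoint. The JSON field names here are hypothetical.
type qualityReport struct {
	Score int    `json:"score"`
	Stage string `json:"stage"`
	Stars int    `json:"stars"`
	Forks int    `json:"forks"`
}

// parseReport decodes a response body into a qualityReport.
func parseReport(body []byte) (qualityReport, error) {
	var r qualityReport
	err := json.Unmarshal(body, &r)
	return r, err
}

func main() {
	// A real client would fetch the body with net/http, e.g.:
	//   resp, err := http.Get("https://pt-edge.onrender.com/api/v1/quality/transformers/gotzmann/llama.go")
	// Here a hard-coded sample keeps the sketch self-contained and offline.
	sample := []byte(`{"score":42,"stage":"Emerging","stars":1398,"forks":70}`)
	r, err := parseReport(sample)
	if err != nil {
		panic(err)
	}
	fmt.Printf("gotzmann/llama.go scores %d/100 (%s)\n", r.Score, r.Stage)
	// prints: gotzmann/llama.go scores 42/100 (Emerging)
}
```

Keeping the decode step in its own function (`parseReport`) makes it trivial to swap the hard-coded sample for the body of an `http.Get` response once the real schema is confirmed.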