xarillian/GDLlama
A working and actively maintained GDExtension for running local LLMs in Godot. Power dynamic NPC dialogue, quests, and more, all from the user's home machine.
GDLlama helps developers using Godot 4.4+ build more dynamic games by running large language models (LLMs) directly inside the game process. Character dialogue, quest details, and other content can be generated without an internet connection: the extension loads a pre-trained LLM and exposes it to the game, producing conversational text or structured data for interactive game elements.
Use this if you are a Godot game developer and want to integrate advanced AI features like dynamic NPC dialogue, quest generation, or semantic search directly into your game, running locally without external servers.
Not ideal if you are not using the Godot game engine or if you need to perform LLM inference on remote servers rather than locally within your game.
Stars: 9
Forks: —
Language: C++
License: MIT
Category: —
Last pushed: Jan 13, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/xarillian/GDLlama"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
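The same endpoint can be called from a script. A minimal Python sketch, assuming the URL pattern shown in the curl example above; the JSON response schema is not documented here, so the sketch just prints the raw payload:

```python
"""Fetch repo quality data from the pt-edge API (sketch)."""
import json
import urllib.request

# Base path taken from the curl example; treat the rest of the API
# surface (other endpoints, response fields) as unknown.
BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"


def build_url(repo: str) -> str:
    """Build the API URL for an 'owner/name' repo slug."""
    return f"{BASE}/{repo}"


def fetch_quality(repo: str) -> dict:
    """GET the quality record; no API key is needed on the free tier."""
    with urllib.request.urlopen(build_url(repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Response fields are an assumption, so dump the whole document.
    print(json.dumps(fetch_quality("xarillian/GDLlama"), indent=2))
```

For the higher 1,000/day tier, the key would presumably be passed with the request (header or query parameter); check the service's docs for the exact mechanism.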
Higher-rated alternatives
mishushakov/llm-scraper
Turn any webpage into structured data using LLMs
Mobile-Artificial-Intelligence/maid
Maid is a free and open source application for interfacing with llama.cpp models locally, and...
run-llama/LlamaIndexTS
Data framework for your LLM applications. Focus on server side solution
JHubi1/ollama-app
A modern and easy-to-use client for Ollama
serge-chat/serge
A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API.