IBJunior/local-agent-docker-model-runner
A flexible, extensible AI agent backend built with NestJS, designed for running local, open-source LLMs (Llama, Gemma, Qwen, DeepSeek, etc.) via Docker Model Runner. Real-time streaming, Redis messaging, web search, and Postgres-backed memory work out of the box. No cloud APIs required.
No commits in the last 6 months.
Stars: 6
Forks: —
Language: TypeScript
License: —
Category:
Last pushed: Jun 22, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/IBJunior/local-agent-docker-model-runner"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000 requests/day.
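The same endpoint can be called programmatically. The sketch below builds the request URL and fetches the data in TypeScript; the JSON response shape and the `X-API-Key` header name are assumptions for illustration (the listing documents only the URL and the rate limits), so check the API's actual key mechanism before relying on them.

```typescript
// Sketch: query the pt-edge quality API for a repo (endpoint from the listing above).
const BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools";

function qualityUrl(owner: string, repo: string): string {
  // Encode each path segment so unusual owner/repo names stay valid in the URL.
  return `${BASE}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}

async function fetchQuality(
  owner: string,
  repo: string,
  apiKey?: string, // optional key for the higher rate limit
): Promise<unknown> {
  const headers: Record<string, string> = {};
  if (apiKey) headers["X-API-Key"] = apiKey; // hypothetical header name, not documented here
  const res = await fetch(qualityUrl(owner, repo), { headers });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json(); // response shape is undocumented; inspect it before using fields
}

// Example: fetchQuality("IBJunior", "local-agent-docker-model-runner")
```

Without a key this is equivalent to the curl command above, up to 100 requests/day.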
Higher-rated alternatives
fractalic-ai/fractalic
Fractalic: Build and version-control AI systems using Markdown & YAML. Combine LLM calls, shell...
Haohao-end/LMForge-End-to-End-LLMOps-Platform-for-Multi-Model-Agents
AI Agent Development Platform - Supports multiple models (OpenAI/DeepSeek/Wenxin/Tongyi),...
qntx/x402-openai-python
Drop-in OpenAI Python client with transparent x402 payment support.
PrefectHQ/marvin
an ambient intelligence library
ArcadeAI/arcade-py
Official Arcade Python Client