tattn/LocalLLMClient
Swift package to run local LLMs on iOS, macOS, Linux
This Swift package lets developers integrate large language models (LLMs) directly into iOS, macOS, or Linux applications, adding AI capabilities such as text generation, question answering, and image analysis. It takes local LLM model files and user prompts as input and produces text responses or tool calls, enabling apps to perform these tasks fully offline.
Use this if you are a Swift developer building an application for Apple platforms or Linux and want to embed AI capabilities using local LLMs without relying on cloud services.
Not ideal if you are a non-developer, or if your application needs model formats beyond those supported by the GGUF, MLX, or FoundationModels backends.
Stars: 168
Forks: 41
Language: Swift
License: MIT
Category:
Last pushed: Jan 31, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/tattn/LocalLLMClient"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
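The endpoint returns JSON. A minimal Python sketch for working with such a response might look like the following; note that the field names (`stars`, `forks`, `language`, `license`) are assumptions for illustration, since the response schema is not documented on this page, and the sample payload simply mirrors the stats shown above.

```python
import json

# Hypothetical example payload: the real response schema is not documented
# here, so these field names are assumptions. Values mirror the stats above.
sample = '{"stars": 168, "forks": 41, "language": "Swift", "license": "MIT"}'

def summarize(payload: str) -> str:
    """Format a one-line summary from the quality-API JSON."""
    data = json.loads(payload)
    return f"{data['language']} project, {data['stars']} stars, {data['license']} license"

print(summarize(sample))  # Swift project, 168 stars, MIT license
```

In a real client you would fetch the payload from the URL above (for example with `urllib.request`) and feed the response body to the same parsing function.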
Related projects
ludwig-ai/ludwig
Low-code framework for building custom LLMs, neural networks, and other AI models
withcatai/node-llama-cpp
Run AI models locally on your machine with node.js bindings for llama.cpp. Enforce a JSON schema...
mudler/LocalAI
🤖 The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and...
zhudotexe/kani
kani (カニ) is a highly hackable microframework for tool-calling language models. (NLP-OSS @ EMNLP 2023)
SciSharp/LLamaSharp
A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.