LocalLLMClient and LLMFarm

                 LocalLLMClient      LLMFarm
Score            56 (Established)    55 (Established)
Maintenance      10/25               10/25
Adoption         10/25               10/25
Maturity         15/25               16/25
Community        21/25               19/25
Stars            168                 1,994
Forks            41                  163
Downloads
Commits (30d)    0                   0
Language         Swift               C
License          MIT                 MIT
Package          No Package          No Package
Dependents       No Dependents       No Dependents

About LocalLLMClient

tattn/LocalLLMClient

Swift package to run local LLMs on iOS, macOS, Linux

This Swift package helps developers integrate large language models (LLMs) directly into their iOS, macOS, or Linux applications. Developers can use it to add AI capabilities like text generation, question answering, and even image analysis. It takes local LLM files and user prompts as input, and outputs text responses or tool calls, enabling apps to perform intelligent tasks offline.

iOS-development macOS-development mobile-app-AI offline-AI local-LLM-integration

About LLMFarm

guinmoon/LLMFarm

LLaMA and other large language models on iOS and macOS, offline, using the GGML library.

This app allows you to run large language models (LLMs) like LLaMA and GPT-2 directly on your iPhone or Mac, without needing an internet connection. You load a chosen language model, enter a prompt, and receive generated text in response, enabling private, fast AI interactions. It's designed for anyone who wants to experiment with or use AI language models offline on Apple devices.

offline-AI personal-AI-assistant local-language-models private-text-generation AI-experimentation

Scores updated daily from GitHub, PyPI, and npm data.