LocalLLMClient and llmfarm_core.swift

                 LocalLLMClient      llmfarm_core.swift
Overall score    56 (Established)    54 (Established)
Maintenance      10/25               6/25
Adoption         10/25               10/25
Maturity         15/25               16/25
Community        21/25               22/25
Stars            168                 278
Forks            41                  57
Downloads
Commits (30d)    0                   0
Language         Swift               C++
License          MIT                 MIT
Package          No Package          No Package
Dependents       No Dependents       No Dependents
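The overall scores above appear to be the sum of the four subscores (Maintenance, Adoption, Maturity, Community), each out of 25 for a 100-point total. A quick check against the table's numbers:

```python
# Subscores as listed in the comparison table above (each out of 25).
local_llm_client = {"Maintenance": 10, "Adoption": 10, "Maturity": 15, "Community": 21}
llmfarm_core = {"Maintenance": 6, "Adoption": 10, "Maturity": 16, "Community": 22}

# Summing each repo's subscores reproduces the overall scores.
print(sum(local_llm_client.values()))  # 56
print(sum(llmfarm_core.values()))      # 54
```

Both sums match the listed overall scores, so despite llmfarm_core.swift's larger star count, LocalLLMClient edges it out on the Maintenance subscore.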

About LocalLLMClient

tattn/LocalLLMClient

Swift package to run local LLMs on iOS, macOS, Linux

This Swift package helps developers integrate large language models (LLMs) directly into their iOS, macOS, or Linux applications. Developers can use it to add AI capabilities like text generation, question answering, and even image analysis. It takes local LLM files and user prompts as input, and outputs text responses or tool calls, enabling apps to perform intelligent tasks offline.

iOS-development macOS-development mobile-app-AI offline-AI local-LLM-integration

About llmfarm_core.swift

guinmoon/llmfarm_core.swift

Swift library to work with llama and other large language models.

This is a Swift library for developers who want to integrate large language models (LLMs) such as LLaMA into their macOS and iOS applications. It lets developers load various LLMs and configure their behavior with different inference and sampling methods, enabling applications that run LLMs directly on Apple devices.

mobile-app-development desktop-app-development AI-integration machine-learning-engineering Swift-development

Scores updated daily from GitHub, PyPI, and npm data.