SwiftyAI/SwiftyMLC
An example of integrating local LLMs into an iOS app using mlc-llm
This project helps iOS app developers integrate local large language models (LLMs) directly into their mobile applications. It lets pre-trained models run entirely on the user's iPhone or iPad, providing advanced AI capabilities without relying on cloud services. Developers can then build features like offline chatbots or on-device text generation into their iOS apps.
No commits in the last 6 months.
Use this if you are an iOS developer looking to add advanced, privacy-preserving AI features that run directly on your users' devices, independent of an internet connection.
Not ideal if you are developing for a platform other than iOS, or if your application requires extremely large models that exceed device memory limitations.
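For a sense of what this kind of integration involves, here is a minimal sketch using MLCSwift, the Swift package that mlc-llm provides for iOS. The model directory name, model library name, and bundle layout below are illustrative assumptions; they depend on how the model was compiled and packaged with mlc_llm, so treat this as a sketch rather than the project's exact code.

```swift
import MLCSwift

@MainActor
func runLocalChat() async {
    let engine = MLCEngine()

    // Load a model that was compiled with mlc_llm and bundled into the app.
    // "Llama-3-8B-Instruct-q4f16_1-MLC" and "llama_q4f16_1" are placeholder
    // names — substitute the actual model folder and model lib you packaged.
    let bundleURL = Bundle.main.bundleURL.appending(path: "bundle")
    let modelPath = bundleURL
        .appending(path: "Llama-3-8B-Instruct-q4f16_1-MLC")
        .path()
    await engine.reload(modelPath: modelPath, modelLib: "llama_q4f16_1")

    // MLCSwift exposes an OpenAI-style chat-completions API that streams
    // response deltas as they are generated on-device.
    for await response in await engine.chat.completions.create(
        messages: [ChatCompletionMessage(role: .user, content: "Hello!")]
    ) {
        if let delta = response.choices.first?.delta.content {
            print(delta.asText(), terminator: "")
        }
    }
}
```

Because inference runs entirely on the device, the first token may take a few seconds while the model weights load, but no network connection is needed afterward.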
Stars: 9
Forks: —
Language: Swift
License: —
Category:
Last pushed: Dec 11, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/SwiftyAI/SwiftyMLC"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
containers/ramalama
RamaLama is an open-source developer tool that simplifies the local serving of AI models from...
av/harbor
One command brings a complete pre-wired LLM stack with hundreds of services to explore.
RunanywhereAI/runanywhere-sdks
Production-ready toolkit to run AI locally
runpod-workers/worker-vllm
The RunPod worker template for serving our large language model endpoints. Powered by vLLM.
foldl/chatllm.cpp
Pure C++ implementation of several models for real-time chatting on your computer (CPU & GPU)