mhayes853/swift-cactus

Cross-platform Swift SDK for Cactus hybrid inference and running LLMs locally in your app.

Score: 36 / 100 (Emerging)

This project helps app developers build mobile and wearable applications that use large language models (LLMs) and other AI capabilities directly on users' devices. Developers can integrate AI features such as smart assistants, image analysis, speech-to-text transcription, and language detection into their apps. It loads a pre-trained AI model in the Cactus format and produces AI-generated responses or data within the app.

Use this if you are a mobile or embedded application developer aiming to add advanced AI features directly into your app for better performance and user privacy, especially for Apple, Android, or ARM Linux platforms.

Not ideal if you are looking for a standalone AI application or a tool for general data analysis outside of app development.

mobile-app-development embedded-systems on-device-AI speech-recognition computer-vision
No package · No dependents
Maintenance 10 / 25
Adoption 6 / 25
Maturity 15 / 25
Community 5 / 25


Stars: 18
Forks: 1
Language: Swift
License: MIT
Last pushed: Mar 09, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/mhayes853/swift-cactus"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.