callstackincubator/ai

On-device LLM execution in React Native with Vercel AI SDK compatibility

Quality score: 52 / 100 (Established)

This library lets mobile app developers integrate AI capabilities directly into their React Native applications, enabling features such as text generation, embeddings, transcription, and speech synthesis. It takes user prompts, audio, or text within the app and returns AI-generated text, audio, or structured data. It is aimed at developers building privacy-focused mobile applications that need fast, on-device AI features without relying on cloud services.

1,219 stars. Actively maintained with 4 commits in the last 30 days.

Use this if you are a mobile app developer building a React Native application and want to add AI features that run directly on the user's device, ensuring privacy and low latency.

Not ideal if you are developing a web application, require extremely large and complex AI models, or prefer to handle all AI processing on a remote server.

mobile-app-development on-device-AI privacy-preserving-apps react-native AI-integration
No package published · No dependents
Maintenance: 13 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 13 / 25


Stars: 1,219
Forks: 41
Language: TypeScript
License: MIT
Last pushed: Mar 02, 2026
Commits (30d): 4

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/callstackincubator/ai"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
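For programmatic access, the curl call above can be mirrored in TypeScript. This is a minimal sketch: the endpoint URL is taken from the curl example, but the JSON response shape is not documented here, so the payload is treated as opaque. The helper name `buildQualityUrl` is illustrative, not part of any published SDK.

```typescript
// Illustrative sketch of calling the quality API from TypeScript.
// Only the URL pattern is taken from the curl example above; the
// response schema is undocumented here, so we return it as `unknown`.

const API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools";

// Hypothetical helper: build the per-repo endpoint URL.
function buildQualityUrl(owner: string, repo: string): string {
  return `${API_BASE}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}

// Fetch and parse the quality data for one repository.
async function fetchQuality(owner: string, repo: string): Promise<unknown> {
  const res = await fetch(buildQualityUrl(owner, repo));
  if (!res.ok) {
    throw new Error(`Quality API returned HTTP ${res.status}`);
  }
  return res.json();
}

// Usage (keyless access is rate-limited to 100 requests/day):
// fetchQuality("callstackincubator", "ai").then(console.log);
```

With a free API key, the same request pattern applies at the higher 1,000/day limit; how the key is passed (header vs. query parameter) is not specified on this page.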