react-native-transformers and react-native-transformers-example
The first is a production library that enables ONNX Runtime-based LLM inference in React Native. The second is a demonstration project showing how to use the JavaScript Transformers.js library in the same environment. They are complementary approaches to the same problem rather than direct competitors or related ecosystem projects.
About react-native-transformers
daviddaytw/react-native-transformers
Run local LLMs from Hugging Face in React Native or Expo using ONNX Runtime.
This project helps mobile app developers build applications that run advanced AI language models (for tasks such as text generation or question answering) directly on users' phones. It takes pre-trained language models from platforms like Hugging Face and embeds them in React Native or Expo apps. The result is an app that can process language tasks offline and quickly, without sending user data to external servers, which benefits developers focused on privacy and performance.
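To make the workflow concrete, the shape of on-device inference with this library looks roughly like the sketch below. This is a hedged illustration, not the library's exact API: the `Pipeline.TextGeneration` namespace, the method names `init`/`generate`, and the model and file identifiers are assumptions that should be checked against the react-native-transformers README before use.

```typescript
// Hypothetical sketch of loading and running a Hugging Face ONNX model
// on-device with react-native-transformers (API names assumed, verify
// against the library's README).
import { Pipeline } from "react-native-transformers";

async function runLocalModel(): Promise<void> {
  // Download the ONNX model from the Hugging Face Hub and initialize
  // ONNX Runtime on the device (model ID and file path are examples).
  await Pipeline.TextGeneration.init(
    "Felladrin/onnx-Llama-160M-Chat-v1", // assumed model repo
    "onnx/decoder_model_merged.onnx"     // assumed ONNX file within the repo
  );

  // Generate text entirely on-device; no data leaves the phone.
  Pipeline.TextGeneration.generate("Hello, how are", (token: string) => {
    console.log(token); // streamed output as tokens are produced
  });
}
```

Because inference runs locally through ONNX Runtime, the app keeps working offline once the model file has been downloaded and cached.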
About react-native-transformers-example
hans00/react-native-transformers-example
Example of transformers.js on React Native
This project helps React Native developers integrate machine learning capabilities directly into their mobile applications. It demonstrates how to use the Transformers.js library to run language models locally on user devices, taking inputs such as raw text and producing outputs such as summaries or classification labels. Mobile app developers who want to add AI features without relying on cloud services would find it useful.