otadk/nuxt-edge-ai
Nuxt module for local-first AI apps with server-side WASM inference via Transformers.js and ONNX Runtime.
This Nuxt module lets web developers integrate AI features directly into their applications, running models locally on the server via WASM with no external services and no Python runtime. Developers send text input and receive AI-generated text output. It targets web developers building interactive applications that process natural language.
Available on npm.
Use this if you are a web developer building a Nuxt application and want AI capabilities that run locally on your server, with an optional fallback to a remote AI service.
Not a fit if you need streaming AI responses, require integration with platforms such as Ollama, or are not building a Nuxt application.
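The local-first design with a remote fallback described above can be sketched as a small wrapper: try server-side WASM inference first, and only call a remote service if the local path fails. This is a minimal illustration of the pattern, not nuxt-edge-ai's actual API; every name below (`withFallback`, the stub backends) is hypothetical.

```typescript
// Hypothetical sketch of the local-first, remote-fallback pattern.
// None of these names come from nuxt-edge-ai itself.

type Infer = (prompt: string) => Promise<string>;

// Try local inference first; on any failure, fall back to the remote service.
function withFallback(local: Infer, remote: Infer): Infer {
  return async (prompt) => {
    try {
      return await local(prompt);
    } catch {
      return await remote(prompt);
    }
  };
}

// Demo with stubbed backends: the "local" model fails, so the remote answers.
const localStub: Infer = async () => { throw new Error('model not loaded'); };
const remoteStub: Infer = async (p) => `remote: ${p}`;

const infer = withFallback(localStub, remoteStub);
infer('hello').then((text) => console.log(text)); // prints "remote: hello"
```

In a real Nuxt app the local backend would be a Transformers.js pipeline running on ONNX Runtime's WASM backend inside a server route, and the remote backend an HTTP call to a hosted model.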
Stars: 33
Forks: —
Language: TypeScript
License: MIT
Category:
Last pushed: Mar 17, 2026
Commits (30d): 0
Dependencies: 1
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/otadk/nuxt-edge-ai"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
Higher-rated alternatives
huggingface/transformers.js
State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with...
huggingface/transformers.js-examples
A collection of 🤗 Transformers.js demos and example applications
daviddaytw/react-native-transformers
Run local LLM from Huggingface in React-Native or Expo using onnxruntime.
salesforce/TransmogrifAI
TransmogrifAI (pronounced trăns-mŏgˈrə-fī) is an AutoML library for building modular, reusable,...
jobergum/browser-ml-inference
Edge Inference in Browser with Transformer NLP model