rozek/node-red-flow-llama
Node-RED Flow (and web page example) for the LLaMA AI model
This project helps you build custom applications that leverage the LLaMA large language model without needing specialized hardware. You can feed text prompts into a Node-RED flow and receive generated text, tokenized data, or text embeddings as output. This is for developers, system integrators, or advanced hobbyists who want to embed LLaMA's AI capabilities into their existing Node-RED-based automation or custom interfaces.
No commits in the last 6 months.
Use this if you are a Node-RED user and want to integrate LLaMA's text generation, tokenization, or embedding capabilities into your automation flows or custom web applications running on standard CPU hardware.
Not ideal if you are looking for a ready-to-use application or a simple API to access LLaMA without needing to set up a Node-RED environment or manage model files.
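To make the integration pattern concrete, here is a minimal Python sketch that POSTs a text prompt to a Node-RED flow fronted by an HTTP In node. The port, the `/llama` path, and the plain-text request/response format are all assumptions for illustration; they depend entirely on how the flow is wired in your own Node-RED instance.

```python
import urllib.request

# Default Node-RED port -- adjust for your deployment (assumption).
NODE_RED_BASE = "http://127.0.0.1:1880"

def endpoint(path: str = "/llama") -> str:
    """URL of the flow's HTTP In node (the path is an assumption)."""
    return NODE_RED_BASE + path

def generate(prompt: str, path: str = "/llama") -> str:
    """POST a plain-text prompt and return the flow's reply.

    Assumes the flow answers with plain text via an HTTP Response
    node; adapt the content type if your flow expects JSON instead.
    """
    req = urllib.request.Request(
        endpoint(path),
        data=prompt.encode("utf-8"),
        headers={"Content-Type": "text/plain"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

print(endpoint())  # → http://127.0.0.1:1880/llama
```

The same shape works for tokenization or embedding flows: only the path and the interpretation of the response body change.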
Stars: 11
Forks: 1
Language: HTML
License: MIT
Category:
Last pushed: Jul 27, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/rozek/node-red-flow-llama"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
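The same endpoint can be queried from code. A minimal Python sketch using only the standard library; the URL comes from the curl example above, but the JSON response schema is not documented on this page, so `fetch_metrics` simply returns the decoded payload for the caller to inspect:

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def metrics_url(owner: str, repo: str) -> str:
    """Build the metrics URL for a given GitHub repository."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_metrics(owner: str, repo: str) -> dict:
    """GET the endpoint and decode its JSON payload.

    Field names in the payload are not documented here, so the
    result is returned as a plain dict rather than a typed object.
    """
    with urllib.request.urlopen(metrics_url(owner, repo)) as resp:
        return json.load(resp)

print(metrics_url("rozek", "node-red-flow-llama"))
# → https://pt-edge.onrender.com/api/v1/quality/transformers/rozek/node-red-flow-llama
```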
Higher-rated alternatives
mishushakov/llm-scraper
Turn any webpage into structured data using LLMs
Mobile-Artificial-Intelligence/maid
Maid is a free and open source application for interfacing with llama.cpp models locally, and...
run-llama/LlamaIndexTS
Data framework for your LLM applications. Focus on server side solution
JHubi1/ollama-app
A modern and easy-to-use client for Ollama
serge-chat/serge
A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API.