rozek/node-red-flow-llama

Node-RED Flow (and web page example) for the LLaMA AI model

Score: 28 / 100 (Experimental)

This project helps you build custom applications that leverage the LLaMA large language model without needing specialized hardware. You can feed text prompts into a Node-RED flow and receive generated text, tokenized data, or text embeddings as output. It is aimed at developers, system integrators, and advanced hobbyists who want to embed LLaMA's AI capabilities into existing Node-RED-based automation or custom interfaces.

No commits in the last 6 months.

Use this if you are a Node-RED user and want to integrate LLaMA's text generation, tokenization, or embedding capabilities into your automation flows or custom web applications running on standard CPU hardware.

Not ideal if you are looking for a ready-to-use application or a simple API to access LLaMA without needing to set up a Node-RED environment or manage model files.

Tags: Node-RED automation · custom AI applications · text generation · text embeddings · local AI inference
Flags: Stale (6 months) · No Package · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 16 / 25
Community 7 / 25


Stars: 11
Forks: 1
Language: HTML
License: MIT
Last pushed: Jul 27, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/rozek/node-red-flow-llama"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.