frost-beta/llm.js

Node.js module providing inference APIs for large language models, with a simple CLI.

Score: 26 / 100 (Experimental)

This is a tool for developers who want to integrate large language models (LLMs) into their Node.js applications or scripts. It lets you load pre-trained LLM weights and either interact with them through a command-line interface or use them programmatically to generate text, process chat conversations, and even analyze images with vision-enabled models. Developers can use it to add AI capabilities to their projects.

No commits in the last 6 months.

Use this if you are a Node.js developer looking to add large language model inference capabilities to your applications, especially on Apple Silicon Macs or Linux machines.

Not ideal if you are a non-developer looking for a ready-to-use application, need to run LLMs on Windows, or require GPU support beyond Apple Silicon.

Tags: AI-integration · Node.js-development · natural-language-processing · machine-learning-inference · chatbot-development
Badges: Stale (6m) · No Package · No Dependents

Score breakdown:
Maintenance: 0 / 25
Adoption: 6 / 25
Maturity: 16 / 25
Community: 4 / 25


Stars: 23
Forks: 1
Language: TypeScript
License: MIT
Last pushed: Dec 07, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/frost-beta/llm.js"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
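The same endpoint can also be called from Node. A minimal sketch, assuming Node 18+ (for the global `fetch`) and that the path follows the curl example above; the JSON response shape is not documented here, so `fetchQuality` simply returns the parsed body:

```javascript
// Base path taken from the curl example; "llm-tools" is the category segment.
const BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools";

// Build the report URL for a given owner/repo, encoding each path
// segment so names with special characters remain valid.
function qualityUrl(owner, repo) {
  return `${BASE}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}

// Fetch the quality report as parsed JSON (response fields are not
// documented on this page, so no shape is assumed here).
async function fetchQuality(owner, repo) {
  const res = await fetch(qualityUrl(owner, repo));
  if (!res.ok) throw new Error(`API returned ${res.status}`);
  return res.json();
}

console.log(qualityUrl("frost-beta", "llm.js"));
```

Keeping the URL builder separate from the fetch makes it easy to respect the 100-requests/day limit by caching results per URL.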