frost-beta/llm.js
Node.js module providing inference APIs for large language models, with a simple CLI.
This is a tool for developers who want to integrate large language models (LLMs) into their Node.js applications or scripts. It loads pre-trained LLM weights and lets you interact with them through a command-line interface, or use them programmatically to generate text, run chat conversations, and analyze images with vision-enabled models.
No commits in the last 6 months.
Use this if you are a Node.js developer looking to add large language model inference capabilities to your applications, especially on Apple Silicon Macs or Linux machines.
Not ideal if you are a non-developer user looking for a ready-to-use application, or if you need to run LLMs on Windows machines or require extensive GPU support beyond Apple Silicon.
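As a rough sketch of the programmatic flow described above (load weights, then chat or generate), the snippet below shows how a chat conversation is typically assembled before being handed to a loaded model. The package name `@frost-beta/llm` and the `loadLLM`/`chat` calls in the comments are assumptions for illustration only and are not confirmed against this repository; consult its README for the actual exports.

```javascript
// Hypothetical sketch: the package name and model API in the comments
// below are assumptions, NOT confirmed against frost-beta/llm.js.
// import {loadLLM} from '@frost-beta/llm';

// A chat conversation is commonly an array of role-tagged messages;
// a small helper keeps the history well-formed and immutable.
function addMessage(history, role, content) {
  return [...history, {role, content}];
}

let history = [];
history = addMessage(history, 'system', 'You are a helpful assistant.');
history = addMessage(history, 'user', 'Describe this image.');

// With a vision-enabled model, one would then pass the history (and an
// image) to the loaded model, e.g. (hypothetical API):
// const llm = await loadLLM('./weights/some-vision-model');
// const reply = await llm.chat(history, {image: './photo.jpg'});
console.log(history.length); // 2 messages queued
```

The role/content message shape mirrors the convention most LLM chat APIs use, which makes the history easy to adapt whichever concrete API the module exposes.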
Stars: 23
Forks: 1
Language: TypeScript
License: MIT
Category:
Last pushed: Dec 07, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/frost-beta/llm.js"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
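The same endpoint can be called from Node.js with the built-in `fetch` (Node 18+). The URL path comes from the curl example above; the shape of the JSON response is not documented here, so the code only fetches and returns the parsed body.

```javascript
// Build the quality-API URL for a given GitHub owner/repo pair.
// Base path taken from the curl example above.
const API_BASE = 'https://pt-edge.onrender.com/api/v1/quality/llm-tools';

function qualityUrl(owner, repo) {
  // Encode each path segment so unusual owner/repo names stay URL-safe.
  return `${API_BASE}/${encodeURIComponent(owner)}/${encodeURIComponent(repo)}`;
}

async function fetchQuality(owner, repo) {
  // No API key needed for up to 100 requests/day.
  const res = await fetch(qualityUrl(owner, repo));
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  return res.json(); // response schema is not documented here
}

// Example (performs a network request):
// fetchQuality('frost-beta', 'llm.js').then(console.log);
```

Keeping the URL construction in its own function makes it easy to reuse for other listed repos without repeating the base path.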
Higher-rated alternatives
lmstudio-ai/lmstudio-js: LM Studio TypeScript SDK
lmstudio-ai/lms: LM Studio CLI
samestrin/llm-interface: A simple NPM interface for seamlessly interacting with 36 Large Language Model (LLM) providers,...
nbonamy/multi-llm-ts: A TypeScript library to use LLM provider APIs in a unified way.
token-js/token.js: Integrate 200+ LLMs with one TypeScript SDK using OpenAI's format.