rizerphe/local-llm-function-calling
A tool for generating function arguments and choosing what function to call with local LLMs
This tool helps developers constrain local large language models (LLMs) to produce structured, specific output that can trigger predefined functions or actions. You provide a task description and a JSON schema describing the data to extract, and the tool constrains generation so the output is guaranteed to conform to that schema. It is aimed at applications where an LLM must extract information precisely or choose a specific tool to call based on user input.
439 stars. No commits in the last 6 months. Available on PyPI.
Use this if you are a developer integrating local LLMs into an application and need to ensure the LLM's text output strictly conforms to a JSON schema for reliable function calling or data extraction.
Not ideal if you are an end-user simply looking for a chat interface or general text generation without the need for structured, schema-constrained outputs.
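The workflow described above — define a JSON schema for a function's arguments, then accept only model output that conforms to it — can be sketched without the library itself. The snippet below is a conceptual illustration, not local-llm-function-calling's API: the function definition, schema, and validator are all made up for the example.

```python
import json

# Hypothetical function definition in the JSON-schema style such tools consume.
weather_function = {
    "name": "get_current_weather",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {"type": "string"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["location"],
    },
}

def conforms(args: dict, schema: dict) -> bool:
    """Minimal check that generated arguments match the schema:
    required keys present, string types and enum values respected."""
    props = schema["properties"]
    if any(key not in args for key in schema.get("required", [])):
        return False
    for key, value in args.items():
        spec = props.get(key)
        if spec is None:
            return False
        if spec["type"] == "string" and not isinstance(value, str):
            return False
        if "enum" in spec and value not in spec["enum"]:
            return False
    return True

# The kind of output a schema-constrained generator is forced to produce:
raw = '{"location": "Brooklyn, NY", "unit": "celsius"}'
assert conforms(json.loads(raw), weather_function["parameters"])
```

A constrained generator goes further than post-hoc validation: it restricts the tokens the model may emit so that invalid output is never produced in the first place, but the acceptance criterion is the same schema check.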
Stars: 439
Forks: 41
Language: Python
License: MIT
Category:
Last pushed: Mar 12, 2024
Commits (30d): 0
Dependencies: 3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/rizerphe/local-llm-function-calling"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
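The same endpoint can be called from Python using only the standard library. A minimal sketch; the live call is left commented so the example stays offline, and the response fields are not documented here:

```python
from urllib.parse import quote
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def tool_url(owner: str, repo: str) -> str:
    """Build the per-repository quality-data URL for the API above."""
    return f"{BASE}/{quote(owner)}/{quote(repo)}"

url = tool_url("rizerphe", "local-llm-function-calling")
# payload = urlopen(url).read()  # live call; 100 requests/day without a key
```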
Related tools
Maximilian-Winter/llama-cpp-agent
The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models...
mozilla-ai/any-llm
Communicate with an LLM provider using a single interface
CliDyn/climsight
A next-generation climate information system that uses large language models (LLMs) alongside...
ShishirPatil/gorilla
Gorilla: Training and Evaluating LLMs for Function Calls (Tool Calls)
OoriData/OgbujiPT
Client-side toolkit for using large language models, including where self-hosted