microsoft/aici
AICI: Prompts as (Wasm) Programs
This project helps developers build custom logic that guides how large language models (LLMs) generate text. You provide the LLM (like Llama or Phi-2) and your custom 'controller' program, which then constrains and directs the LLM's output in real time. This tool is for software engineers and AI researchers who are building and experimenting with advanced LLM applications.
2,064 stars. No commits in the last 6 months.
Use this if you need to precisely control LLM output, implement advanced programmatic decoding, or orchestrate complex multi-agent conversations directly within the LLM's generation process.
Not ideal if you are a non-developer seeking an out-of-the-box solution for basic constrained text generation; for that, consider dedicated libraries like LLGuidance.
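The core idea described above is that a controller sits inside the generation loop and constrains which tokens the model may emit next. Below is a toy, self-contained sketch of that token-masking idea; it is not AICI's actual API (AICI controllers are Wasm programs), just an illustration of the mechanism, with the vocabulary, model, and constraint all invented for the example.

```python
# Toy illustration of controller-style constrained decoding.
# A "controller" inspects the generation state and masks out
# disallowed next tokens before each decoding step.
# NOT AICI's real API -- a minimal stand-in for the concept.

VOCAB = ["yes", "no", "maybe", "<eos>"]

def toy_model_logits(generated):
    # Stand-in for an LLM: fixed scores instead of a real forward pass.
    return {"yes": 0.1, "no": 0.2, "maybe": 0.9, "<eos>": 0.5}

def controller_mask(generated):
    # Example constraint: the answer must be "yes" or "no",
    # and after one token only "<eos>" is allowed.
    if not generated:
        return {"yes", "no"}
    return {"<eos>"}

def generate(max_steps=4):
    out = []
    for _ in range(max_steps):
        logits = toy_model_logits(out)
        allowed = controller_mask(out)
        # Pick the highest-scoring token the controller allows.
        token = max(allowed, key=lambda t: logits[t])
        if token == "<eos>":
            break
        out.append(token)
    return out

print(generate())  # ['no']
```

Even though the raw model prefers "maybe", the controller's mask forces a "yes"/"no" answer; AICI applies the same principle per token, but with the constraint logic compiled to Wasm and run alongside the real model.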
Stars: 2,064
Forks: 82
Language: Rust
License: MIT
Category:
Last pushed: Jan 22, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/microsoft/aici"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
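For scripted use, the same endpoint can be called from Python. The URL structure is taken from the curl example above; the `X-Api-Key` header name is an assumption for illustration, as the page does not document how the optional key is sent.

```python
# Minimal sketch of building a request to the quality API.
# URL path taken from the curl example; the "X-Api-Key" header
# name is an ASSUMPTION, not documented on this page.
from urllib.request import Request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def build_request(category, repo, api_key=None):
    url = f"{BASE}/{category}/{repo}"
    headers = {}
    if api_key:
        headers["X-Api-Key"] = api_key  # assumed header name
    return Request(url, headers=headers)

req = build_request("llm-tools", "microsoft/aici")
print(req.full_url)
# https://pt-edge.onrender.com/api/v1/quality/llm-tools/microsoft/aici
```

Pass the resulting `Request` to `urllib.request.urlopen` (or use any HTTP client) to fetch the JSON payload.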
Higher-rated alternatives
trymirai/uzu
A high-performance inference engine for AI models
justrach/bhumi
⚡ Bhumi – The fastest AI inference client for Python, built with Rust for unmatched speed,...
lipish/llm-connector
LLM Connector - A unified interface for connecting to various Large Language Model providers
keyvank/femtoGPT
Pure Rust implementation of a minimal Generative Pretrained Transformer
ShelbyJenkins/llm_client
The Easiest Rust Interface for Local LLMs and an Interface for Deterministic Signals from...