CogitatorTech/infera
A DuckDB extension for in-database inference
This project lets data analysts and data scientists apply machine learning models directly within their database queries. Instead of exporting data to an external tool, running a model, and re-importing the results, you load the extension in DuckDB and run inference over your tables to get predictions or classifications in place. It is aimed at anyone who works with large datasets in DuckDB and wants to integrate machine learning inference into their data analysis workflow.
Use this if you need to run machine learning predictions or classifications on data stored in DuckDB without moving it out of the database.
Not ideal if you need to train machine learning models or use models not in the ONNX format.
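To illustrate the workflow, here is a hypothetical query shape. The function name infera_predict, its arguments, and the install/load commands are assumptions for illustration, not infera's documented API:

```sql
-- Load the extension (exact install/load commands are assumptions):
-- INSTALL infera; LOAD infera;

-- Score every row of a table with an ONNX model without leaving SQL.
-- 'infera_predict' and its signature are hypothetical placeholders.
SELECT id,
       infera_predict('model.onnx', feature_a, feature_b) AS prediction
FROM measurements;
```

The point of this shape is that the data never leaves the database: the model runs row by row inside the query engine, so results can feed directly into joins, filters, or aggregations in the same statement.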
Stars: 128
Forks: 5
Language: Rust
License: Apache-2.0
Category:
Last pushed: Mar 12, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/CogitatorTech/infera"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
tracel-ai/burn: Burn is a next generation tensor library and Deep Learning Framework that doesn't compromise on...
sonos/tract: Tiny, no-nonsense, self-contained, Tensorflow and ONNX inference
pykeio/ort: Fast ML inference & training for ONNX models in Rust
elixir-nx/ortex: ONNX Runtime bindings for Elixir
robertknight/rten: ONNX neural network inference engine