BatsResearch/bonito

A lightweight library for generating synthetic instruction tuning datasets for your data without GPT.

Score: 43 / 100 (Emerging)

Bonito helps machine learning engineers and researchers quickly create new training data for instruction-tuned language models. You provide unannotated text, and Bonito generates relevant question-answer pairs or other task-specific examples. This is ideal for fine-tuning models on specialized tasks where labeled data is scarce or expensive to produce manually.
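The core idea — raw passages in, (instruction, answer) training records out — can be sketched in plain Python. Note this is a conceptual stand-in only: `generate_pair` is a hypothetical stub for the model call, not Bonito's actual API.

```python
# Sketch of the synthetic-data idea behind Bonito: turn unannotated
# passages into (instruction, answer) training records.
# `generate_pair` is a hypothetical stub standing in for the model call.

def generate_pair(passage: str, task_type: str) -> dict:
    # A real model would condition on the passage and the task type;
    # here we just template the fields so the record shape is visible.
    return {
        "input": f"Answer based on the passage: {passage}",
        "output": f"(model-generated answer for a {task_type} task)",
    }

passages = [
    "Bonito generates instruction tuning data from raw text.",
    "Synthetic datasets help when labels are scarce.",
]
dataset = [generate_pair(p, "question answering") for p in passages]
print(len(dataset))  # one record per passage
```

Each record already has the input/output shape that instruction-tuning pipelines expect, so the generated set can be fed straight into a fine-tuning loop.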

823 stars. No commits in the last 6 months.

Use this if you need to rapidly generate synthetic training datasets for custom instruction-tuning tasks without relying on large, external language models like GPT.

Not ideal if you require human-quality, nuanced annotations for sensitive or safety-critical applications.

Tags: large-language-models, natural-language-processing, dataset-generation, model-fine-tuning, AI-research
Status: Stale (6 months) · No package published · No dependents
Maintenance 2 / 25
Adoption 10 / 25
Maturity 16 / 25
Community 15 / 25
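The four category scores (each out of 25) sum to the overall score, which is how the breakdown above lines up with the 43 / 100 at the top:

```python
# Category scores as listed on this page; each category is out of 25.
scores = {"Maintenance": 2, "Adoption": 10, "Maturity": 16, "Community": 15}

total = sum(scores.values())
print(total)  # 43, matching the overall 43 / 100
```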


Stars: 823
Forks: 56
Language: Python
License: BSD-3-Clause
Last pushed: Jul 15, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/BatsResearch/bonito"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
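The same endpoint can be called from Python. A minimal sketch, assuming only the URL shape shown in the curl command above (the response schema is not documented on this page, so parsing is left generic):

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(owner: str, repo: str) -> str:
    # Build the quality endpoint URL for a given repository,
    # matching the curl example on this page.
    return f"{API_BASE}/llm-tools/{owner}/{repo}"

url = quality_url("BatsResearch", "bonito")

# Anonymous access allows 100 requests/day; fetch like so:
# with urllib.request.urlopen(url) as resp:
#     data = json.load(resp)  # schema not documented here
```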