dkopi/Bitune

Implementation of Bitune: Bidirectional Instruction-Tuning

24 / 100
Experimental

This project helps machine learning researchers reproduce the results of the Bitune paper on improving large language models. It takes a base language model and instruction-tuning datasets as input and produces a fine-tuned model that performs better on various downstream tasks. It is aimed primarily at academic researchers and engineers working on cutting-edge language model development.

No commits in the last 6 months.

Use this if you are a researcher aiming to validate or build upon the 'Bitune: Bidirectional Instruction-Tuning' paper's findings using the original implementation.

Not ideal if you are looking for a user-friendly tool to fine-tune a language model without deep technical knowledge of the underlying research and implementation details.

large-language-models instruction-tuning natural-language-processing machine-learning-research model-fine-tuning
No License · Stale (6m) · No Package · No Dependents
Maintenance 2 / 25
Adoption 7 / 25
Maturity 8 / 25
Community 7 / 25


Stars: 25
Forks: 2
Language: Python
License: none
Last pushed: Jun 19, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/dkopi/Bitune"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
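The overall quality score is the sum of the four category scores listed above (Maintenance 2 + Adoption 7 + Maturity 8 + Community 7 = 24). A minimal sketch of working with the API response in Python, assuming a hypothetical JSON shape; the field names below are illustrative guesses, not the documented schema:

```python
import json

# Hypothetical response body for the quality endpoint shown above.
# Field names ("score", "breakdown", etc.) are assumptions for illustration.
sample = json.loads("""
{
  "score": 24,
  "breakdown": {"maintenance": 2, "adoption": 7, "maturity": 8, "community": 7},
  "stars": 25,
  "forks": 2
}
""")

# The overall score is the sum of the per-category scores.
total = sum(sample["breakdown"].values())
print(total)           # 24
print(sample["score"]) # 24
```

In a real client you would fetch this payload with an HTTP GET against the URL in the curl command and then parse it the same way.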