wyt2000/InverseCoder

[AAAI 2025] The official code for the paper "InverseCoder: Unleashing the Power of Instruction-Tuned Code LLMs with Inverse-Instruct" (https://arxiv.org/abs/2407.05700).

Score: 19 / 100 (Experimental)

This project helps AI engineers and machine learning researchers enhance the capabilities of large language models for code generation. It takes existing code snippets and automatically generates high-quality programming instructions for them. The output is a refined dataset that makes code LLMs better at understanding and responding to natural language prompts for coding tasks.
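The core idea (inverting code back into instructions) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the prompt wording, the `generate_instruction` helper, and the `llm` callable are all assumptions for demonstration.

```python
# Sketch of the Inverse-Instruct idea: given a code snippet, ask an
# instruction-tuned code LLM to write the natural-language instruction
# that could have produced it. The prompt template below is illustrative,
# not the exact one used by the paper.

def build_inverse_prompt(code: str) -> str:
    """Wrap a code snippet in a prompt asking the model to invert it
    into a programming instruction."""
    return (
        "Below is a piece of code. Write the instruction a user could "
        "have given to produce this code.\n\n"
        f"```python\n{code}\n```\n\nInstruction:"
    )

def generate_instruction(code: str, llm) -> str:
    """`llm` is any callable mapping a prompt string to model text,
    e.g. a thin wrapper around an instruction-tuned code LLM."""
    return llm(build_inverse_prompt(code)).strip()

# Each resulting (instruction, code) pair becomes one training example
# for fine-tuning a stronger code LLM. A stubbed model for illustration:
snippet = "def add(a, b):\n    return a + b"
fake_llm = lambda prompt: "Write a function that adds two numbers."
pair = {
    "instruction": generate_instruction(snippet, fake_llm),
    "output": snippet,
}
```

In practice the stubbed `fake_llm` would be replaced by a real model call, and the generated pairs would be filtered for quality before fine-tuning.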

No commits in the last 6 months.

Use this if you are a developer or researcher looking to create more robust and accurate code generation LLMs by automatically expanding and improving their training data.

Not ideal if you are looking for a tool to directly write code for your projects; this is a toolkit for training code generation models, not for direct code production.

AI-engineering LLM-training code-generation data-synthesis machine-learning-research
No License · Stale 6m · No Package · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 8 / 25
Community 6 / 25

How are scores calculated?

Stars: 14

Forks: 1

Language: Python

License: None

Last pushed: Jul 10, 2024

Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/wyt2000/InverseCoder"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
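The same data can be fetched from Python instead of curl. A minimal sketch using only the standard library; the JSON field names in the response are not documented here, so the example stops at decoding.

```python
# Fetch the quality report for a repo from the API shown above.
# Only the URL shape is taken from this page; the response schema
# is an assumption and is left to the caller to inspect.
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(repo: str) -> str:
    """Build the API URL for a GitHub repo given as 'owner/name'."""
    return f"{API_BASE}/{repo}"

def fetch_quality(repo: str) -> dict:
    """Fetch and decode the quality report (requires network access)."""
    with urllib.request.urlopen(quality_url(repo)) as resp:
        return json.load(resp)

url = quality_url("wyt2000/InverseCoder")
```

Without an API key this is limited to 100 requests per day, per the note above.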