bigcode-project/selfcodealign
[NeurIPS'24] SelfCodeAlign: Self-Alignment for Code Generation
This project offers a method to automatically improve the performance of code-generating AI models. It takes an existing code language model and uses a self-alignment process to produce a new, more capable version of that model. This is for researchers and developers who build or train large language models specifically for coding tasks.
323 stars. No commits in the last 6 months.
Use this if you are a machine learning researcher or engineer aiming to enhance a code generation model's capabilities without relying on expensive human annotations or proprietary model data.
Not ideal if you are looking for a ready-to-use code generation tool or if you do not have expertise in training large language models.
Stars
323
Forks
24
Language
Python
License
Apache-2.0
Category
Last pushed
Feb 24, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/bigcode-project/selfcodealign"
Open to everyone: 100 requests/day with no key. Get a free key for 1,000 requests/day.
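The curl call above can also be made from Python. A minimal sketch using only the standard library, assuming the endpoint returns JSON; the field names in the sample payload (`stars`, `forks`, `license`) are hypothetical illustrations, not a documented schema:

```python
import json
from urllib.request import urlopen

# Endpoint shown on this page (no key needed, up to 100 requests/day).
URL = "https://pt-edge.onrender.com/api/v1/quality/transformers/bigcode-project/selfcodealign"

def fetch_quality(url: str = URL) -> dict:
    """Fetch the repo-quality record and parse it as JSON."""
    with urlopen(url, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))

# The exact response schema is not documented here; the fields below are
# hypothetical examples of what a quality record might contain.
sample = '{"stars": 323, "forks": 24, "license": "Apache-2.0"}'
record = json.loads(sample)
print(record["stars"])
```

Parsing is shown against a local sample string so the snippet works without network access; swap in `fetch_quality()` to hit the live endpoint.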
Higher-rated alternatives
steering-vectors/steering-vectors
Steering vectors for transformer language models in Pytorch / Huggingface
jianghoucheng/AlphaEdit
AlphaEdit: Null-Space Constrained Knowledge Editing for Language Models, ICLR 2025 (Outstanding Paper)
kmeng01/memit
Mass-editing thousands of facts into a transformer memory (ICLR 2023)
boyiwei/alignment-attribution-code
[ICML 2024] Assessing the Brittleness of Safety Alignment via Pruning and Low-Rank Modifications
jianghoucheng/AnyEdit
AnyEdit: Edit Any Knowledge Encoded in Language Models, ICML 2025