jianghoucheng/AlphaEdit
AlphaEdit: Null-Space Constrained Knowledge Editing for Language Models, ICLR 2025 (Outstanding Paper)
AlphaEdit helps LLM developers update specific factual knowledge in a model without inadvertently altering unrelated information. Given an existing LLM and the new facts or corrections to apply, it produces an updated model that incorporates the new information while preserving the rest of its knowledge. It is aimed at developers and researchers who need to modify an LLM's knowledge efficiently and precisely.
Use this if you need to precisely update specific factual knowledge within a large language model while ensuring its existing, correct knowledge remains undisturbed.
Not ideal if you want to fine-tune an LLM for broad task improvement, or if you do not work directly with model weights and parameters.
Stars
423
Forks
44
Language
Python
License
MIT
Last pushed
Oct 15, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/jianghoucheng/AlphaEdit"
Open to everyone: 100 requests/day with no key required. Get a free key for 1,000 requests/day.
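The same endpoint can be called programmatically. A minimal sketch, assuming the endpoint above returns JSON; the `fetch_quality` helper, the bearer-token header for the API key, and any response field names are assumptions, not documented behavior:

```python
import json
import urllib.request
from typing import Optional

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def build_url(ecosystem: str, repo: str) -> str:
    """Construct the per-repo quality endpoint URL shown in the curl example."""
    return f"{API_BASE}/{ecosystem}/{repo}"


def fetch_quality(ecosystem: str, repo: str, api_key: Optional[str] = None) -> dict:
    """Fetch quality data for a repo.

    Passing an API key as a bearer token is an assumption about how the
    higher rate limit is unlocked -- check the API docs for the real scheme.
    """
    req = urllib.request.Request(build_url(ecosystem, repo))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)


# Usage (performs a network request):
# data = fetch_quality("transformers", "jianghoucheng/AlphaEdit")
```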
Related models
steering-vectors/steering-vectors
Steering vectors for transformer language models in Pytorch / Huggingface
kmeng01/memit
Mass-editing thousands of facts into a transformer memory (ICLR 2023)
boyiwei/alignment-attribution-code
[ICML 2024] Assessing the Brittleness of Safety Alignment via Pruning and Low-Rank Modifications
jianghoucheng/AnyEdit
AnyEdit: Edit Any Knowledge Encoded in Language Models, ICML 2025
zjunlp/KnowledgeCircuits
[NeurIPS 2024] Knowledge Circuits in Pretrained Transformers