ziansu/codeart
Official repo for FSE'24 paper "CodeArt: Better Code Models by Attention Regularization When Symbols Are Lacking"
CodeArt is for researchers and developers working with code models, especially when the code is heavily obfuscated or lacks clear symbolic information. It improves these models through a new pre-training method that regularizes attention when symbols are lacking, producing more accurate and robust code models.
No commits in the last 6 months.
Use this if you are training or fine-tuning code models and their performance suffers on code with limited or obscured symbolic information.
Not ideal if you are looking for an off-the-shelf code analysis tool, or if you mainly work with code that already has rich, clear symbolic information.
Stars
18
Forks
2
Language
Python
License
MIT
Last pushed
Mar 10, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ziansu/codeart"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
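For scripted access, here is a minimal Python sketch using the requests library. It assumes the endpoint returns JSON and shows keyless access only; the response fields and any API-key header name are not documented here, so treat them as assumptions.

import requests

# Hypothetical usage sketch: fetch the quality data for ziansu/codeart.
url = "https://pt-edge.onrender.com/api/v1/quality/transformers/ziansu/codeart"
response = requests.get(url, timeout=10)
response.raise_for_status()   # fail loudly on HTTP errors or rate limiting
data = response.json()        # assumes a JSON payload; exact fields are not documented here
print(data)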
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration for his Dreamer line of work
Nicolepcx/Transformers-in-Action
This is the corresponding code for the book Transformers in Action