bayesgroup/code_transformers
Empirical Study of Transformers for Source Code & A Simple Approach for Handling Out-of-Vocabulary Identifiers in Deep Learning for Source Code
This project provides tools for software engineers and researchers to experiment with and evaluate transformer models on code-related tasks. It targets three specific problems: detecting variable misuse, predicting accurate function names, and code completion. Inputs are Python or JavaScript source code; outputs are model predictions and evaluation results for these tasks. It is intended for deep learning practitioners working with code.
No commits in the last 6 months.
Use this if you are a deep learning researcher or software engineer developing models for code analysis tasks like identifying variable misuse or improving code completion.
Not ideal if you are an end-user developer looking for a ready-to-use tool for code quality checks or an IDE plugin.
Stars
66
Forks
19
Language
Python
License
—
Category
Last pushed
Dec 03, 2021
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/bayesgroup/code_transformers"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000 requests/day.
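The same endpoint can be called from code. The sketch below builds the URL shown in the curl command and fetches it with Python's standard library; it assumes the response is JSON, and the bearer-token header used for the optional API key is an assumption (the listing does not document the auth scheme).

```python
# Minimal sketch of calling the quality API shown above.
# The endpoint URL comes from the listing; the auth header name
# and the JSON response assumption are not documented here.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner, repo):
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner, repo, api_key=None):
    """Fetch quality data for a repo; an optional key raises the daily limit."""
    req = urllib.request.Request(quality_url(owner, repo))
    if api_key:
        # Header name is an assumption -- check the API docs for the real scheme.
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    print(quality_url("bayesgroup", "code_transformers"))
```

Without a key this is limited to 100 requests per day, so cache responses locally if you poll many repositories.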
Higher-rated alternatives
huggingface/transformers
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in...
kyegomez/LongNet
Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
pbloem/former
Simple transformer implementation from scratch in pytorch. (archival, latest version on codeberg)
NVIDIA/FasterTransformer
Transformer related optimization, including BERT, GPT
kyegomez/SimplifiedTransformers
SimplifiedTransformer simplifies transformer block without affecting training. Skip connections,...