SIC98/GPT2-python-code-generator
GPT-2 fine-tuning with Hugging Face Transformers 🤗
This project auto-completes Python snippets: you input a starting piece of Python code, and a fine-tuned GPT-2 model outputs a longer, plausible continuation. It's aimed at Python developers who want assistance writing boilerplate or exploring code structures.
No commits in the last 6 months.
Use this if you are a Python developer looking for an experimental code generation tool to assist with writing boilerplate or exploring code patterns.
Not ideal if you need production-ready code generation or a tool that understands complex programming logic beyond simple auto-completion.
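A minimal sketch of how such a completion model is typically invoked, assuming the fine-tuned weights are published on the Hugging Face Hub under the same id as this repository (not confirmed by the card; the first run downloads several hundred MB):

```python
from transformers import pipeline

# Assumption: the model id on the Hub matches the GitHub repository name.
generator = pipeline("text-generation", model="SIC98/GPT2-python-code-generator")

prompt = "def fibonacci(n):"
# Sample a short continuation of the prompt; output quality is experimental.
result = generator(prompt, max_new_tokens=32, do_sample=True)
print(result[0]["generated_text"])
```

By default the pipeline returns the prompt plus the generated continuation in one string.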
Stars: 28
Forks: 3
Language: Jupyter Notebook
License: —
Category:
Last pushed: Feb 07, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/SIC98/GPT2-python-code-generator"
Open to everyone: 100 requests/day with no key needed. Get a free key to raise the limit to 1,000/day.
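The curl example above follows the pattern `/api/v1/quality/{ecosystem}/{owner}/{repo}`. A small sketch that builds the same URL for any repository; the response schema and any key-passing mechanism are not documented here, so the fetch step is left as a commented assumption of a plain JSON GET:

```python
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL following the curl example's pattern."""
    return f"{BASE}/{ecosystem}/{owner}/{repo}"

url = quality_url("transformers", "SIC98", "GPT2-python-code-generator")
print(url)

# To actually fetch (assumes an unauthenticated GET works; anonymous
# access is limited to 100 requests/day):
# with urllib.request.urlopen(url) as resp:
#     print(resp.read().decode())
```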
Higher-rated alternatives
- tabularis-ai/be_great: A novel approach for synthesizing tabular data using pretrained large language models
- EleutherAI/gpt-neox: An implementation of model-parallel autoregressive transformers on GPUs, based on the Megatron...
- shibing624/textgen: TextGen, an implementation of text generation models including LLaMA, BLOOM, GPT2, BART, T5, SongNet...
- ai-forever/ru-gpts: Russian GPT3 models.
- AdityaNG/kan-gpt: The PyTorch implementation of Generative Pre-trained Transformers (GPTs) using Kolmogorov-Arnold...