shibing624/textgen
TextGen: implementations of text generation models, including LLaMA, ChatGLM, BLOOM, GPT2, Seq2Seq, BART, T5, SongNet, UDA, and more, with training and inference supported out of the box.
This project helps developers and researchers create custom text generation models for a variety of tasks. You provide a dataset of text examples, and it outputs a fine-tuned model capable of generating new, similar text. This is ideal for machine learning engineers, data scientists, and AI researchers who need to adapt large language models for specific applications.
979 stars. No commits in the last 6 months. Available on PyPI.
Use this if you need to fine-tune a pre-existing text generation model (like GPT, LLaMA, or T5) with your specific data to improve its performance for a niche task, such as generating medical responses or Chinese couplets.
Not ideal if you are looking for an off-the-shelf application to generate text without any coding or model customization.
Stars: 979
Forks: 112
Language: Python
License: Apache-2.0
Category:
Last pushed: Sep 14, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/shibing624/textgen"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
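The curl command above can also be issued from Python. A minimal sketch using only the standard library; the response is assumed to be JSON, and the helper names here are illustrative, not part of the service:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(repo: str) -> str:
    # Build the per-repository endpoint URL; repo is "owner/name".
    return f"{BASE}/{repo}"

def fetch_quality(repo: str) -> dict:
    # Anonymous access is rate-limited to 100 requests/day.
    # The response format is assumed to be JSON; inspect the raw
    # body if parsing fails.
    with urllib.request.urlopen(quality_url(repo)) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example: the endpoint for this repository.
print(quality_url("shibing624/textgen"))
```

Calling `fetch_quality("shibing624/textgen")` performs the same request as the curl example and returns the parsed response.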
Related models
tabularis-ai/be_great
A novel approach for synthesizing tabular data using pretrained large language models
EleutherAI/gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron...
ai-forever/ru-gpts
Russian GPT3 models.
AdityaNG/kan-gpt
The PyTorch implementation of Generative Pre-trained Transformers (GPTs) using Kolmogorov-Arnold...
zemlyansky/gpt-tfjs
GPT in TensorFlow.js