AGPatriota/GPT-2-for-R
A GPT-2 for R with the OpenAI trained weights
This project provides a Generative Pre-trained Transformer 2 (GPT-2) model for R, loaded with OpenAI's trained weights, so R users can generate human-like text. You supply a starting phrase or sentence, and the model returns a continuation of that text. It is aimed at researchers, data scientists, and anyone working in R who wants to experiment with large language models for text generation.
No commits in the last 6 months.
Use this if you are an R user who wants to explore text generation from within your R environment by supplying a prompt and receiving a coherent continuation.
Not ideal if you are looking for a plug-and-play solution for production-grade applications or if you need to train a custom language model on your own data.
Stars: 30
Forks: 2
Language: R
License: —
Category: —
Last pushed: May 27, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/AGPatriota/GPT-2-for-R"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.
Higher-rated alternatives
elevenlabs/elevenlabs-js
The official JavaScript (Node) library for the ElevenLabs API.
rish-16/gpt2client
✍🏻 gpt2-client: Easy-to-use TensorFlow Wrapper for GPT-2 117M, 345M, 774M, and 1.5B Transformer...
shaRk-033/ai.c
gpt written in plain c
brucetruth/minigpt
A minimal, hackable, CPU-first GPT implementation in pure Go
soumyadip1995/TextBrain
A tool that is built using several open source services and uses Open AI's GPT-2 as a base model.