MagedSaeed/generate-sequences
A Python package for generating sequences (greedy and beam search) from PyTorch models (not necessarily HF Transformers).
This library helps deep learning practitioners generate text or sequences from their PyTorch models, even if those models aren't Hugging Face compatible. It takes a custom PyTorch model and an input, then produces generated sequences using either greedy search or beam search decoding. It's aimed at machine learning engineers, researchers, and data scientists building custom sequence generation systems.
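To make the greedy decoding strategy concrete, here is a minimal, self-contained sketch in plain PyTorch. The toy model (`embedding` + `head`) and the function names are illustrative assumptions, not the generate-sequences API: at each step the model scores the next token and the highest-scoring one is appended until an end-of-sequence id or a length limit is reached.

```python
import torch

# Toy stand-in for a custom PyTorch decoder: maps a (1, t) tensor of
# token ids to next-token logits of shape (1, vocab_size).
# Hypothetical example only, NOT the generate-sequences API.
torch.manual_seed(0)
vocab_size = 8
embedding = torch.nn.Embedding(vocab_size, 16)
head = torch.nn.Linear(16, vocab_size)

def next_token_logits(ids: torch.Tensor) -> torch.Tensor:
    hidden = embedding(ids).mean(dim=1)  # crude pooling over the prefix
    return head(hidden)                  # (1, vocab_size)

def greedy_decode(prompt_ids, eos_id=0, max_len=10):
    ids = torch.tensor([prompt_ids])
    for _ in range(max_len - len(prompt_ids)):
        with torch.no_grad():
            logits = next_token_logits(ids)
        next_id = int(logits.argmax(dim=-1))       # pick the best token
        ids = torch.cat([ids, torch.tensor([[next_id]])], dim=1)
        if next_id == eos_id:                      # stop on end-of-sequence
            break
    return ids[0].tolist()

print(greedy_decode([1, 2]))  # prefix [1, 2] followed by greedily chosen ids
```

Greedy search is fast but commits to one token per step; the beam search variant below keeps several hypotheses alive instead.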
Available on PyPI.
Use this if you need to generate sequences (like text, code, or data streams) from a custom PyTorch deep learning model and want flexible control over the decoding process.
Not ideal if you work exclusively with Hugging Face models, since the Transformers library's built-in generation methods are already optimized for that ecosystem.
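For comparison, the beam search strategy can be sketched the same way. Again, the toy model and function names are assumptions for illustration, not the library's API: instead of committing to one token per step, the decoder keeps the `beam_width` highest-scoring partial sequences (ranked by cumulative log-probability) and returns the best finished hypothesis.

```python
import torch

# Same toy decoder as before; hypothetical, NOT the generate-sequences API.
torch.manual_seed(0)
vocab_size = 8
embedding = torch.nn.Embedding(vocab_size, 16)
head = torch.nn.Linear(16, vocab_size)

def next_token_log_probs(ids):
    hidden = embedding(torch.tensor([ids])).mean(dim=1)
    return torch.log_softmax(head(hidden), dim=-1)[0]  # (vocab_size,)

def beam_search(prompt_ids, beam_width=3, max_len=8, eos_id=0):
    # Each hypothesis: (cumulative log-prob, token ids).
    beams = [(0.0, list(prompt_ids))]
    finished = []
    for _ in range(max_len - len(prompt_ids)):
        candidates = []
        for score, ids in beams:
            with torch.no_grad():
                log_probs = next_token_log_probs(ids)
            top = torch.topk(log_probs, beam_width)
            for lp, tok in zip(top.values.tolist(), top.indices.tolist()):
                candidates.append((score + lp, ids + [tok]))
        # Keep only the best beam_width expansions.
        candidates.sort(key=lambda c: c[0], reverse=True)
        beams = []
        for score, ids in candidates[:beam_width]:
            if ids[-1] == eos_id:
                finished.append((score, ids))  # hypothesis is complete
            else:
                beams.append((score, ids))
        if not beams:
            break
    finished.extend(beams)  # include unfinished beams if no EOS was reached
    return max(finished, key=lambda c: c[0])[1]

print(beam_search([1, 2]))
```

With `beam_width=1` this reduces to greedy search; wider beams trade compute for a better global score.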
Stars
18
Forks
2
Language
Python
License
Apache-2.0
Category
Last pushed
Dec 12, 2025
Commits (30d)
0
Dependencies
3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/MagedSaeed/generate-sequences"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
worldbank/REaLTabFormer
A suite of auto-regressive and Seq2Seq (sequence-to-sequence) transformer models for tabular and...
tlkh/t2t-tuner
Convenient Text-to-Text Training for Transformers
NohTow/PPL-MCTS
Repository for the code of the "PPL-MCTS: Constrained Textual Generation Through...
styfeng/TinyDialogues
Code & data for the EMNLP 2024 paper: Is Child-Directed Speech Effective Training Data for...
readme-generator/alreadyme-ai-serving
Serving large language model with transformers