shreyansh26/LLM-Sampling
A collection of LLM sampling methods implemented in pure PyTorch
This tool helps machine learning engineers and researchers fine-tune how large language models generate text. You input a prompt, select a generation method, and the tool outputs text generated by a Hugging Face model, letting you experiment with different sampling strategies to control creativity and coherence. It's designed for those who want precise control over text generation for specific applications.
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher who needs to experiment with different text generation strategies for large language models to achieve specific output characteristics like creativity, factual accuracy, or JSON adherence.
Not ideal if you are looking for a no-code solution or simply want to use an LLM without needing to understand or control the underlying sampling mechanisms.
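To illustrate the kind of method the repository collects, here is a minimal sketch of nucleus (top-p) sampling in pure PyTorch. This is an illustrative example, not code taken from the repository; the function name and signature are assumptions.

```python
import torch

def top_p_sample(logits: torch.Tensor, p: float = 0.9, temperature: float = 1.0) -> int:
    """Nucleus (top-p) sampling over a 1-D logits tensor.

    Illustrative sketch only -- not the repo's actual implementation.
    """
    probs = torch.softmax(logits / temperature, dim=-1)
    sorted_probs, sorted_idx = torch.sort(probs, descending=True)
    cumulative = torch.cumsum(sorted_probs, dim=-1)
    # Keep the smallest set of tokens whose cumulative probability reaches p
    cutoff = int(torch.searchsorted(cumulative, p).item()) + 1
    kept = sorted_probs[:cutoff] / sorted_probs[:cutoff].sum()  # renormalize
    choice = torch.multinomial(kept, num_samples=1)
    return int(sorted_idx[choice].item())

# With a very small p, only the single most likely token survives the cutoff,
# so sampling becomes deterministic (greedy).
token_id = top_p_sample(torch.tensor([5.0, 1.0, 0.0]), p=0.1)
```

Raising `p` widens the candidate pool (more diverse output); lowering it narrows the pool toward greedy decoding.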
Stars
28
Forks
3
Language
Python
License
—
Category
Last pushed
Dec 09, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/shreyansh26/LLM-Sampling"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
rasbt/LLMs-from-scratch
Implement a ChatGPT-like LLM in PyTorch from scratch, step by step
facebookresearch/LayerSkip
Code for "LayerSkip: Enabling Early Exit Inference and Self-Speculative Decoding", ACL 2024
FareedKhan-dev/train-llm-from-scratch
A straightforward method for training your LLM, from downloading data to generating text.
kmeng01/rome
Locating and editing factual associations in GPT (NeurIPS 2022)
datawhalechina/llms-from-scratch-cn
Build a large language model from scratch with only basic Python; step-by-step implementations of GLM4, Llama3, and RWKV6 for a deep understanding of how large models work