tqch/poisson-jump
Official Implementation of Paper "Learning to Jump: Thinning and Thickening Latent Counts for Generative Modeling" (ICML 2023)
This project implements the paper's count-based generative model: data such as images are represented as latent counts, corrupted by a thinning process, and a learned thickening process reverses that corruption to generate novel samples resembling the training data. It is aimed at researchers and practitioners working on generative modeling, particularly diffusion-style methods for count data and image synthesis.
No commits in the last 6 months.
Use this if you are a machine learning researcher or practitioner interested in advanced generative modeling techniques for count data or image synthesis.
Not ideal if you are looking for a straightforward, out-of-the-box solution for data augmentation or content generation without deep technical involvement.
Stars
10
Forks
1
Language
Python
License
MIT
Category
Diffusion
Last pushed
Jun 06, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/tqch/poisson-jump"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
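The curl command above suggests a path pattern of `/api/v1/quality/{category}/{owner}/{repo}`. A minimal Python sketch of a client for this endpoint, assuming that pattern generalizes beyond the one example URL shown (the helper names and the JSON response schema are assumptions, not documented API):

```python
# Hypothetical client for the pt-edge API shown above. The
# /api/v1/quality/{category}/{owner}/{repo} path pattern is
# inferred from the single example URL and may not generalize.
import json
from urllib.parse import quote
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the endpoint URL for a repository's quality data."""
    return f"{BASE}/{quote(category)}/{quote(owner)}/{quote(repo)}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (response schema not documented here)."""
    with urlopen(quality_url(category, owner, repo), timeout=10) as resp:
        return json.load(resp)

print(quality_url("diffusion", "tqch", "poisson-jump"))
```

Without a key the free tier allows 100 requests/day, so a client like this should cache responses rather than re-fetch on every call.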
Higher-rated alternatives
xie-lab-ml/Golden-Noise-for-Diffusion-Models
[ICCV2025] The code of our work "Golden Noise for Diffusion Models: A Learning Framework".
yulewang97/ERDiff
[NeurIPS 2023 Spotlight] Official Repo for "Extraction and Recovery of Spatio-temporal Structure...
UNIC-Lab/RadioDiff
This is the code for the paper "RadioDiff: An Effective Generative Diffusion Model for...
pantheon5100/pid_diffusion
This repository is the official implementation of the paper: Physics Informed Distillation for...
zju-pi/diff-sampler
An open-source toolbox for fast sampling of diffusion models. Official implementations of our...