juliusberner/sde_sampler

"Improved sampling via learned diffusions" (ICLR 2024) and "An optimal control perspective on diffusion-based generative modeling" (TMLR 2024)

Score: 39 / 100 (Emerging)

This project helps researchers and practitioners generate diverse, high-quality samples from complex, unnormalized probability distributions. You provide an unnormalized target density, and the project outputs samples that follow that distribution, enabling tasks such as Monte Carlo estimation or generative modeling. It is aimed at quantitative analysts, statisticians, and machine learning researchers working with intricate distributions.

No commits in the last 6 months.

Use this if you need to generate samples from a probability distribution whose mathematical form is known but whose normalization constant is intractable, and you're looking for advanced, diffusion-based sampling techniques.

Not ideal if you're looking for a simple, off-the-shelf sampling method for standard distributions or if you don't have a technical understanding of stochastic differential equations and neural network-based control.
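To illustrate the problem setting (sampling when only an unnormalized density is known), here is a minimal sketch using plain unadjusted Langevin dynamics. This is not the repository's learned-diffusion method, which trains neural SDEs; it only shows why the normalization constant is never needed: the sampler consumes the gradient of the log-density, and any constant factor drops out.

```python
import numpy as np

# Unnormalized target: p(x) ∝ exp(-U(x)) with a double-well potential
# U(x) = (x² - 1)², whose modes sit at x = ±1. The normalization
# constant is intractable in general, but Langevin dynamics only
# needs ∇log p(x) = -∇U(x), where any constant cancels.
def grad_U(x):
    return 4.0 * x * (x**2 - 1.0)

def langevin_sample(n_chains=1000, n_steps=2000, step=0.01, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n_chains)  # initialize chains from N(0, 1)
    for _ in range(n_steps):
        noise = rng.normal(size=n_chains)
        # Euler-Maruyama step of the overdamped Langevin SDE:
        # dX = -∇U(X) dt + √2 dW
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * noise
    return x

samples = langevin_sample()
```

The repository replaces this fixed drift with a learned, time-dependent drift (the "control"), which is what makes sampling efficient for harder, multimodal targets.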

generative-modeling stochastic-processes computational-statistics quantitative-analysis probability-density-estimation
Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 9 / 25
Maturity: 16 / 25
Community: 14 / 25


Stars: 74
Forks: 10
Language: Python
License: MIT
Last pushed: Mar 14, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/juliusberner/sde_sampler"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.