MadryLab/journey-TRAK

Code for the paper "The Journey, Not the Destination: How Data Guides Diffusion Models"

Score: 30 / 100 (Emerging)

When you generate an image with a diffusion model, this tool helps you estimate which training images influenced its creation. You provide a generated image and the diffusion model that produced it, and it identifies the training examples most responsible for different parts of the generation process. It is aimed at researchers and practitioners working with generative AI who need to analyze and debug their diffusion models.
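The attribution idea can be illustrated with a minimal sketch. This is not journey-TRAK's actual API; it is a toy demonstration of the general principle behind gradient-based data attribution: score each training example by how well its loss gradient aligns with the gradient of the loss on the generated (query) sample, evaluated at the trained parameters. The linear model and squared-error loss here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only; journey-TRAK's real method and API differ.
# Core idea: attribution score = similarity between the query sample's
# loss gradient and each training example's loss gradient.

rng = np.random.default_rng(0)

# Toy "model": a linear map with parameters theta, squared-error loss.
theta = rng.normal(size=4)

def loss_grad(x, y, theta):
    """Gradient of 0.5 * (theta @ x - y)**2 with respect to theta."""
    return (theta @ x - y) * x

# A few hypothetical training examples and one "generated" query point.
train = [(rng.normal(size=4), rng.normal()) for _ in range(5)]
query_x, query_y = rng.normal(size=4), rng.normal()

g_query = loss_grad(query_x, query_y, theta)

# Attribution score per training example: gradient dot product.
scores = [float(g_query @ loss_grad(x, y, theta)) for x, y in train]
most_influential = int(np.argmax(scores))
print(most_influential, scores)
```

A larger positive score suggests the training example pushed the parameters in a direction that lowers the loss on the query sample, i.e., it "helped" produce it.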

No commits in the last 6 months.

Use this if you need to trace the influence of individual training data points on the output of a diffusion model, for example, to understand model bias or data leakage.

Not ideal if you are looking for a tool to train diffusion models or perform general image generation tasks.

generative-AI diffusion-models model-explainability AI-auditing data-attribution
Badges: Stale (6m) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 16 / 25
Community: 7 / 25


Stars: 25
Forks: 2
Language: Python
License: MIT
Last pushed: Dec 12, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/MadryLab/journey-TRAK"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
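The same endpoint can be called from Python with the standard library. This is a minimal sketch based on the curl command above; the shape of the JSON response is an assumption, so inspect the raw payload before relying on specific fields.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL from the curl example."""
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch and decode the quality payload (requires network access;
    response fields are an assumption, not documented here)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

url = quality_url("diffusion", "MadryLab", "journey-TRAK")
print(url)
```

With a free API key (for the 1,000/day tier), you would presumably pass it as a header or query parameter; the source does not specify which, so check the service's documentation.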