Li-Jinsong/DAEDAL

[ICLR 2026] Official repository of "Beyond Fixed: Training-Free Variable-Length Denoising for Diffusion Large Language Models"

Quality score: 43 / 100 (Emerging)

This project helps developers working with Diffusion Large Language Models (DLLMs) overcome a significant limitation: the need to pre-define the output length for every text generation task. Instead of manually guessing or setting a fixed length, this tool allows DLLMs to dynamically determine the appropriate response length on the fly. It takes a text prompt and generates a response that is precisely as long as needed, eliminating truncation for complex tasks and reducing unnecessary computation for simple ones.


Use this if you are a machine learning engineer or researcher developing with Diffusion Large Language Models and need to generate variable-length text outputs without performance trade-offs or manual length tuning.

Not ideal if you are working with Autoregressive Large Language Models or if your text generation tasks consistently require fixed-length outputs.

Topics: Diffusion Models · Large Language Models · NLP Development · Text Generation · Model Inference Optimization
No package published · No dependents
Maintenance: 10 / 25
Adoption: 10 / 25
Maturity: 15 / 25
Community: 8 / 25

How are scores calculated?

Stars: 162
Forks: 6
Language: Python
License: Apache-2.0
Last pushed: Feb 16, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/diffusion/Li-Jinsong/DAEDAL"

Open to everyone: 100 requests/day with no API key required. A free key raises the limit to 1,000 requests/day.
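For programmatic access, the same endpoint can be called from Python. A minimal sketch, assuming the URL pattern shown in the curl command above (`/api/v1/quality/<ecosystem>/<owner>/<repo>`); the response schema is not documented here, so the fetch is left commented out and the JSON should be inspected before relying on any field names:

```python
import json
from urllib.request import Request, urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    # Build the endpoint URL from its path segments, mirroring the curl example.
    return f"{BASE}/{ecosystem}/{owner}/{repo}"

url = quality_url("diffusion", "Li-Jinsong", "DAEDAL")

# Uncomment to fetch live data (no API key needed up to 100 requests/day):
# with urlopen(Request(url, headers={"Accept": "application/json"})) as resp:
#     data = json.load(resp)
```

The helper only assembles the URL; keeping the network call separate makes it easy to swap in a different HTTP client or add an API key header later.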