# Diffusion Models
10 articles with this tag
Diffusion Models: Associative Memory with Creative Spark
UDDMs function as Associative Memories with emergent creativity. Conditional entropy signals the memorization-to-generalization transition, driven by dataset size.
MISTY: Single-Step Planning for Autonomous Driving
MISTY motion planner achieves SOTA closed-loop performance in autonomous driving via single-step inference, enabling proactive maneuvers at 99+ FPS.
HiVLA: Decoupling Reasoning for Robotic Control
HiVLA decouples VLM reasoning from motor control using a hierarchical framework, enhancing robotic manipulation performance and preserving zero-shot capabilities.
DMax: Parallel Decoding for Diffusion LLMs
DMax accelerates diffusion language models with Soft Parallel Decoding, substantially increasing tokens per forward pass (TPF) while preserving accuracy and reaching 1,338 TPS.
Drifting Models Revolutionize MRI-to-CT Synthesis
Drifting models outperform diffusion and traditional methods in MRI-to-CT synthesis, offering millisecond inference for efficient, high-quality pelvic imaging.
EdgeDiT: Transformers on the Edge
EdgeDiT brings high-fidelity generative AI to mobile devices by optimizing Diffusion Transformers for NPUs, achieving significant efficiency gains.
Stefano Ermon on Diffusion Models for Text
Stefano Ermon discusses the potential of diffusion models for text generation, highlighting their advantages in controllability and efficiency over traditional autoregressive models.
DCDP: Dynamic Diffusion Policies for Robotics
The DCDP framework enhances robotic adaptability by integrating real-time environmental dynamics into action correction, delivering significant performance gains in dynamic environments with minimal computational overhead.
NAP: Unlocking Parallel Generation in Diffusion Language Models
Researchers propose NAP, a data-centric approach to enable true parallel generation in Diffusion Language Models by aligning training data with non-autoregressive decoding.
