techminis

A naukri.com initiative

Image Credit: Arxiv

DDIL: Diversity Enhancing Diffusion Distillation With Imitation Learning

  • Diffusion models excel at generative modeling, but sampling is slow because it requires many sequential passes through the denoising network.
  • Covariate shift is identified as a cause of the poor performance of multi-step distilled models: the distilled student encounters states at inference that differ from those it was trained on.
  • To address covariate shift, the researchers propose diffusion distillation within an imitation learning framework (DDIL).
  • DDIL enhances the training distribution used for distilling diffusion models, improving both performance and training stability.
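The covariate-shift fix described above resembles DAgger-style imitation learning: instead of training the student only on teacher-generated states, states from the student's own rollouts are collected and labeled by the teacher. The following is a minimal toy sketch of that idea, not the authors' implementation; the denoisers, the scalar student parameter, and all function names are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def teacher_denoise(x):
    # Hypothetical "teacher" denoiser: shrinks the sample toward the origin.
    return 0.9 * x

def student_denoise(x, w):
    # Hypothetical "student": a single learnable scalar weight w.
    return w * x

def collect_student_states(w, steps=5, n=64):
    # DAgger-style data collection: roll out the *student* for several
    # denoising steps and record the states it actually visits, so the
    # training distribution matches what the student sees at inference.
    x = rng.normal(size=(n, 2))
    states = []
    for _ in range(steps):
        states.append(x.copy())
        x = student_denoise(x, w)
    return np.concatenate(states)

def distill(w=1.0, iters=50, lr=0.05):
    for _ in range(iters):
        xs = collect_student_states(w)   # student-visited states
        target = teacher_denoise(xs)     # teacher labels on those states
        pred = student_denoise(xs, w)
        grad = 2.0 * np.mean((pred - target) * xs)  # d(MSE)/dw
        w -= lr * grad
    return w

w = distill()  # converges toward the teacher's shrink factor of 0.9
```

Because the teacher is queried on states drawn from the student's own trajectories rather than only from the data distribution, the student's training and inference distributions match, which is the imitation-learning remedy for covariate shift that the summary attributes to DDIL.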
