Diffusion models excel at generative modeling, but sampling is slow because it requires many passes through the denoising network. Covariate shift is identified as a cause of the poor performance of multi-step distilled models: at inference time the student conditions on latents from its own earlier predictions, which differ from the forward-diffused latents seen during training. To address this, the researchers propose diffusion distillation within an imitation learning framework (DDIL). DDIL enriches the training distribution used for distilling diffusion models, improving both performance and stability.
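The core idea can be sketched as a DAgger-style mixing of two state distributions when building distillation batches: latents obtained by forward-diffusing real data, and latents actually visited by the student's own multi-step sampler. The following is a minimal toy sketch, not the paper's implementation; the noise schedule, the placeholder `student_denoise`, and the function names `forward_diffuse` and `ddil_batch` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_diffuse(x0, t):
    # Toy forward process: interpolate the clean sample toward Gaussian noise.
    return np.sqrt(1.0 - t) * x0 + np.sqrt(t) * rng.standard_normal(x0.shape)

def student_denoise(xt, t):
    # Placeholder for the student network: a simple shrink toward zero.
    return 0.9 * xt

def ddil_batch(x0, steps, mix=0.5):
    """Build a distillation batch that mixes two latent distributions:
    - 'data' latents from forward diffusion of real samples, and
    - 'rollout' latents reached by running the student's own sampler.
    Training on both exposes the student to the states it induces at
    inference, mitigating covariate shift."""
    batch = []
    for t in np.linspace(1.0, 1.0 / steps, steps):
        if rng.random() < mix:
            xt = forward_diffuse(x0, t)          # teacher/data distribution
        else:
            xt = rng.standard_normal(x0.shape)   # pure noise at t = 1
            for s in np.linspace(1.0, t, 3):     # roll the student to time t
                xt = student_denoise(xt, s)
        batch.append((xt, t))                    # (latent, timestep) pair
    return batch

batch = ddil_batch(np.ones(4), steps=4)
```

Each pair in the batch would then be scored against the teacher's prediction at the same latent and timestep; only the batch-construction step is shown here.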