Autoregressive next-step prediction models are widely used for building data-driven neural solvers for time-dependent partial differential equations (PDEs).
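To make the autoregressive setup concrete, here is a minimal sketch of a next-step rollout loop, where each predicted state is fed back as the input for the following step. The trained neural solver is stood in for by a hypothetical `step_fn`; the decay map used below is purely illustrative and not part of the original work.

```python
def rollout(step_fn, u0, n_steps):
    """Autoregressive next-step prediction: repeatedly apply the
    one-step model, feeding each prediction back as the next input."""
    traj = [u0]
    for _ in range(n_steps):
        traj.append(step_fn(traj[-1]))
    return traj

# Toy stand-in for a learned one-step solver: exponential decay.
states = rollout(lambda u: 0.9 * u, 1.0, 3)
# states now holds [u_0, u_1, u_2, u_3].
```

Error accumulation over such rollouts is exactly why long-term stability, mentioned below, is a key evaluation criterion.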
This work proposes a latent diffusion model for PDE simulation that reduces computational cost by operating in a compressed latent space.
An autoencoder maps different types of meshes onto a unified structured latent grid, enabling the model to handle complex geometries.
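As a rough illustration of mapping an unstructured mesh onto a structured latent grid, the sketch below bins scattered mesh samples into a regular grid by nearest-cell averaging. This is a crude geometric stand-in, not the paper's learned autoencoder; the function name and grid resolution are assumptions for the example.

```python
import numpy as np

def mesh_to_latent_grid(points, values, grid_size=8):
    """Scatter unstructured mesh samples onto a regular latent grid.

    Geometric stand-in for a learned encoder: each mesh point in
    [0, 1]^2 is assigned to its nearest grid cell, and cell values
    are averaged. `points` is (N, 2), `values` is (N,).
    """
    grid_sum = np.zeros((grid_size, grid_size))
    grid_cnt = np.zeros((grid_size, grid_size))
    idx = np.clip((points * grid_size).astype(int), 0, grid_size - 1)
    for (i, j), v in zip(idx, values):
        grid_sum[i, j] += v
        grid_cnt[i, j] += 1
    # Empty cells stay at zero instead of dividing by zero.
    return np.divide(grid_sum, grid_cnt, out=np.zeros_like(grid_sum),
                     where=grid_cnt > 0)

# Irregular mesh: random points carrying the smooth field u(x, y) = x + y.
rng = np.random.default_rng(0)
pts = rng.random((500, 2))
latent = mesh_to_latent_grid(pts, pts.sum(axis=1), grid_size=8)
```

A learned encoder would replace this fixed binning with trainable interpolation, but the end product is the same: a fixed-size grid on which a standard latent diffusion model can operate regardless of the input mesh.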
The proposed model outperforms deterministic baselines in accuracy and long-term stability, demonstrating the potential of diffusion-based approaches for robust data-driven PDE learning.