Forward-only diffusion (FoD) is a generative modelling approach that learns data generation directly through a single forward diffusion process.
FoD uses a state-dependent linear stochastic differential equation with mean-reverting terms in its drift and diffusion functions, which ensures convergence to the clean data and realizes a stochastic interpolation between the source and target distributions.
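To make the mean-reverting, state-dependent construction concrete, below is a minimal Euler-Maruyama sketch of one such SDE, where both the drift and the diffusion scale with the residual toward the clean target. The specific parameterization (constant `theta` and `sigma`, the residual-scaled noise) is an illustrative assumption, not the exact FoD formulation from the paper.

```python
import math
import torch

def simulate_forward_sde(x_source, x_target, theta=2.0, sigma=0.5, n_steps=1000):
    """Illustrative Euler-Maruyama simulation of a mean-reverting,
    state-dependent linear SDE of the assumed form
        dx_t = theta * (y - x_t) dt + sigma * (y - x_t) dW_t,
    where y = x_target is the clean data. Because both drift and diffusion
    shrink as x_t approaches y, the trajectory converges to the target."""
    dt = 1.0 / n_steps
    x = x_source.clone()
    for _ in range(n_steps):
        residual = x_target - x                    # state-dependent mean-reverting term
        drift = theta * residual * dt
        diffusion = sigma * residual * math.sqrt(dt) * torch.randn_like(x)
        x = x + drift + diffusion
    return x

# Example: drive a source batch (e.g. degraded images) toward clean targets
x_source = torch.randn(4, 3, 64, 64)
x_clean = torch.randn(4, 3, 64, 64)
x_final = simulate_forward_sde(x_source, x_clean)
```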
Despite its simplicity, FoD achieves competitive performance on image-conditioned and unconditional generation tasks, demonstrating its effectiveness as a generative model.
The FoD model is analytically tractable, is trained with a stochastic flow matching objective, and supports non-Markov-chain sampling at inference time. Code is available at https://github.com/Algolzw/FoD.
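The sketch below illustrates what a stochastic flow-matching-style training step could look like under the assumptions above: sample a time, form a noisy interpolant between source and clean data, and regress the network onto the residual pointing toward the clean data. The interpolant, the loss form, and the `model(x_t, t)` signature are assumptions for illustration, not the exact FoD objective; see the repository for the authors' implementation.

```python
import torch
import torch.nn as nn

def fod_training_step(model: nn.Module, x_clean: torch.Tensor, x_source: torch.Tensor):
    """One flow-matching-style training step (illustrative sketch, assumed forms).
    Draws t ~ U(0, 1), builds a stochastic interpolant x_t between the source and
    clean data, and regresses the network output onto the residual x_clean - x_t."""
    b = x_clean.shape[0]
    t = torch.rand(b, 1, 1, 1)                               # per-sample time in (0, 1)
    noise = torch.randn_like(x_clean)
    # Stochastic interpolant between source and target distributions (assumed form)
    x_t = (1 - t) * x_source + t * x_clean + 0.1 * t * (1 - t) * noise
    target = x_clean - x_t                                   # residual the model should match
    pred = model(x_t, t.flatten())                           # assumed model signature
    return ((pred - target) ** 2).mean()
```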