Soft Contrastive Variational Inference (SoftCVI) is introduced, allowing a family of variational objectives to be derived through a contrastive estimation framework.
SoftCVI reframes the inference task as a contrastive estimation problem, aiming to identify a single true posterior sample among a set of samples; despite this framing, it requires neither positive nor negative samples.
Instead, it learns by sampling the variational distribution and computing ground truth soft classification labels from the unnormalized posterior itself.
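The mechanism above can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: it assumes one particular member of the objective family, a simple choice of negative distribution (here taken equal to the variational distribution), and a 1D Gaussian example; the function names and the stop-gradient convention noted in the comments are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a vector of logits.
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def softcvi_loss(theta, log_q, log_post_unnorm, log_pi):
    """Evaluate a SoftCVI-style objective on K samples theta ~ q.

    Sketch only: in a real implementation the labels would be treated
    as constants (stop-gradient), and gradients would flow to the
    variational parameters through the samples and log_q.
    """
    # Soft classification labels computed from the unnormalized posterior.
    labels = softmax(log_post_unnorm(theta) - log_pi(theta))
    # Classifier probabilities parameterized via the variational distribution.
    preds = softmax(log_q(theta) - log_pi(theta))
    # Cross-entropy between soft labels and classifier predictions.
    return -np.sum(labels * np.log(preds))

# Toy example: unnormalized N(0, 1) posterior, Gaussian q, negatives pi = q.
rng = np.random.default_rng(0)
K = 8
theta = rng.normal(0.0, 1.5, size=K)  # K samples from q = N(0, 1.5^2)
log_q = lambda t: -0.5 * (t / 1.5) ** 2 - np.log(1.5 * np.sqrt(2 * np.pi))
log_post = lambda t: -0.5 * t ** 2    # unnormalized posterior density
loss = softcvi_loss(theta, log_q, log_post, log_q)
```

When the classifier probabilities match the soft labels, the loss is minimized at the entropy of the labels, which occurs when the variational distribution is proportional to the unnormalized posterior.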
Empirical investigation shows that SoftCVI yields stable and effective objectives across a range of Bayesian inference tasks, frequently outperforming other variational approaches.