techminis

A naukri.com initiative


Image Credit: Arxiv

Temperature Optimization for Bayesian Deep Learning

  • The Cold Posterior Effect (CPE) in Bayesian Deep Learning (BDL) refers to the observation that tempering the posterior to a "cold" temperature often improves the predictive performance of the posterior predictive distribution (PPD).
  • However, colder is not always better, and the BDL community has lacked a systematic method for determining the optimal temperature.
  • This study proposes a data-driven approach: treat the temperature as a model parameter and estimate it directly from the data so as to maximize the test log-predictive density.
  • In empirical evaluations on regression and classification tasks, the proposed method matches the performance of grid search at a fraction of the computational cost.
  • The BDL and Generalized Bayes communities view the CPE differently: the former emphasizes the predictive performance of the PPD, while the latter stresses the utility of the posterior under model misspecification, leading to differing temperature preferences.
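The core idea above — picking the temperature that maximizes held-out log-predictive density, either by grid search or by direct optimization — can be sketched in a toy classification setting. Everything below (the synthetic posterior-sample logits, the data-generating process, and the use of SciPy's bounded scalar optimizer in place of the paper's estimation procedure) is an illustrative assumption, not the authors' algorithm:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Hypothetical setup: logits from S posterior samples, N held-out points, C classes.
S, N, C = 8, 200, 3
y = rng.integers(0, C, size=N)
# Synthetic logits that carry some signal about the true class.
logits = 2.0 * np.eye(C)[y] + rng.normal(scale=2.0, size=(S, N, C))

def log_ppd(T):
    """Held-out log-predictive density of the tempered posterior predictive.

    Each posterior sample's softmax is sharpened/flattened by temperature T,
    then averaged over samples to form the PPD.
    """
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)          # numerical stability
    probs = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    ppd = probs.mean(axis=0)                       # (N, C) predictive distribution
    return np.log(ppd[np.arange(N), y] + 1e-12).mean()

# Baseline: grid search over candidate temperatures.
grid = np.linspace(0.1, 2.0, 40)
T_grid = grid[np.argmax([log_ppd(T) for T in grid])]

# Data-driven alternative: optimize log T directly (one 1-D solve, no grid).
res = minimize_scalar(lambda lt: -log_ppd(np.exp(lt)),
                      bounds=(np.log(0.05), np.log(5.0)), method="bounded")
T_opt = np.exp(res.x)
```

In this sketch the direct optimization evaluates `log_ppd` far fewer times than the 40-point grid while finding a temperature at least as good, which mirrors the cost argument made in the summary.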
