Source: Arxiv
Revisiting the Equivalence of Bayesian Neural Networks and Gaussian Processes: On the Importance of Learning Activations

  • Gaussian Processes (GPs) are useful for modeling uncertainty with function-space priors, while Bayesian Neural Networks (BNNs) are more scalable but lack some GP advantages.
  • Prior work has tried to make wide BNNs mimic GP behavior, but existing approaches fall short of faithfully transferring GP priors to BNNs.
  • The study shows that trainable activation functions are essential for effectively mapping GP priors onto wide BNNs.
  • The closed-form 2-Wasserstein distance between Gaussians enables efficient optimization of the reparameterized priors and activations (a brief sketch of this closed form follows the list).
  • The method introduces trainable periodic activations to enforce global stationarity, and functional priors conditioned on GP hyperparameters for efficient model selection (a hypothetical sketch of a trainable periodic activation also appears below).
  • Empirical results show that the proposed method outperforms existing approaches and matches the performance of heuristic methods while resting on a stronger theoretical foundation.
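
The 2-Wasserstein distance is attractive here because, between two Gaussian distributions, it has a closed form that can be evaluated and differentiated directly. Below is a minimal NumPy/SciPy sketch of that closed form; the function name and the toy covariance matrices are illustrative stand-ins, not the paper's implementation.

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2_squared(mu1, cov1, mu2, cov2):
    """Squared 2-Wasserstein distance between N(mu1, cov1) and N(mu2, cov2).

    Closed form:
        ||mu1 - mu2||^2 + Tr(cov1 + cov2 - 2 (cov2^{1/2} cov1 cov2^{1/2})^{1/2})
    """
    sqrt_cov2 = sqrtm(cov2)
    cross_term = sqrtm(sqrt_cov2 @ cov1 @ sqrt_cov2)
    # sqrtm may return values with tiny imaginary parts due to numerical error
    cross_term = np.real(cross_term)
    mean_term = np.sum((mu1 - mu2) ** 2)
    trace_term = np.trace(cov1 + cov2 - 2.0 * cross_term)
    return mean_term + trace_term

if __name__ == "__main__":
    # Toy example: compare a GP prior marginal with an induced BNN prior marginal,
    # both treated as Gaussians over the same finite set of input points.
    rng = np.random.default_rng(0)
    d = 5
    mu_gp, mu_bnn = np.zeros(d), np.zeros(d)
    a = rng.standard_normal((d, d))
    cov_gp = a @ a.T + d * np.eye(d)    # stand-in for a GP kernel matrix
    b = rng.standard_normal((d, d))
    cov_bnn = b @ b.T + d * np.eye(d)   # stand-in for an induced BNN covariance
    print(gaussian_w2_squared(mu_gp, cov_gp, mu_bnn, cov_bnn))
```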

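The summary does not spell out how the trainable periodic activations are parameterized. As a purely hypothetical illustration, one simple way to make a periodic activation trainable is a small sum of sinusoids with learnable frequencies, phases, and amplitudes, sketched below in PyTorch; this is not the paper's parameterization.

```python
import torch
import torch.nn as nn

class TrainablePeriodicActivation(nn.Module):
    """Hypothetical trainable periodic activation:
    a sum of sinusoids with learnable frequencies, phases, and amplitudes."""

    def __init__(self, num_terms: int = 8):
        super().__init__()
        self.freq = nn.Parameter(torch.randn(num_terms))
        self.phase = nn.Parameter(torch.zeros(num_terms))
        self.amp = nn.Parameter(torch.ones(num_terms) / num_terms)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # sum_k amp_k * sin(freq_k * x + phase_k), broadcast over the last axis
        x = x.unsqueeze(-1)
        return (self.amp * torch.sin(self.freq * x + self.phase)).sum(dim=-1)

if __name__ == "__main__":
    act = TrainablePeriodicActivation()
    x = torch.linspace(-3.0, 3.0, 7)
    print(act(x))  # same shape as x; parameters train jointly with the network
```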