Source: Arxiv


Entropic bounds for conditionally Gaussian vectors and applications to neural networks

  • Entropic inequalities from information theory yield new bounds on the total-variation and 2-Wasserstein distances between conditionally Gaussian laws and Gaussian laws.
  • These bounds quantify how quickly a randomly initialized fully connected neural network, together with its derivatives, converges to a Gaussian distribution.
  • The results improve and extend earlier work on Gaussian approximation of such networks.
  • Cumulant estimates and assumptions on the activation function play a crucial role in the proofs.
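The Gaussian limit described in the bullets can be illustrated empirically. Below is a minimal sketch, not the paper's construction: the one-hidden-layer tanh architecture, the fixed input, the initialization variances, and the use of the 1-Wasserstein distance (via `scipy.stats.wasserstein_distance`, rather than the paper's 2-Wasserstein metric) are all illustrative assumptions. It samples the scalar output of a randomly initialized network at a fixed input and compares its empirical law to a matched-variance Gaussian as the width grows.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)

def random_nn_output(width, n_samples):
    """Sample the scalar output of a random one-hidden-layer tanh
    network at the fixed input x = (1, 1, 1) (an arbitrary choice)."""
    x = np.ones(3)
    d = x.shape[0]
    # Gaussian initialization with variances 1/d and 1/width, so the
    # output variance stays O(1) as the width grows
    W1 = rng.normal(0.0, 1.0 / np.sqrt(d), size=(n_samples, width, d))
    W2 = rng.normal(0.0, 1.0 / np.sqrt(width), size=(n_samples, width))
    hidden = np.tanh(W1 @ x)                # shape (n_samples, width)
    return np.einsum("sw,sw->s", W2, hidden)

n = 5000
for width in (4, 32, 256):
    out = random_nn_output(width, n)
    # Matched-variance Gaussian reference sample
    ref = rng.normal(0.0, out.std(), n)
    print(f"width={width:4d}  W1 distance ~ {wasserstein_distance(out, ref):.4f}")
```

With a finite sample the estimate carries Monte Carlo noise of order `n**-0.5`, so the distances shrink toward that noise floor as the width increases rather than to exactly zero.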
