Source: Arxiv


On the Local Complexity of Linear Regions in Deep ReLU Networks

  • Researchers have defined the local complexity of a neural network with continuous piecewise linear activations as a measure of the density of linear regions over an input data distribution.
  • They show theoretically that ReLU networks which learn low-dimensional feature representations have lower local complexity.
  • This connects recent empirical observations on feature learning to concrete properties of the learned functions.
  • Local complexity also serves as an upper bound on the total variation of the function over the input data distribution, linking feature learning to adversarial robustness.
  • The researchers further examine how optimization drives ReLU networks toward solutions with lower local complexity, contributing a theoretical framework for understanding the geometric properties of ReLU networks in relation to learning.
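The notion of linear regions in the bullets above can be made concrete with a small sketch. In a ReLU network, each input falls into a linear region determined by which units are active (its activation pattern), so counting distinct activation patterns among points sampled near an input gives a rough empirical proxy for local region density. Note this is an illustrative estimator only, with a hypothetical random network; the paper defines local complexity formally, not via this sampling scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny ReLU network 2 -> 8 -> 8 -> 1 with random weights (for illustration only).
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 8)), rng.normal(size=8)

def activation_pattern(x):
    """Return the on/off pattern of every ReLU unit at input x.

    Inputs sharing a pattern lie in the same linear region of the network.
    """
    h1 = W1 @ x + b1
    h2 = W2 @ np.maximum(h1, 0) + b2
    return tuple((h1 > 0).astype(int)) + tuple((h2 > 0).astype(int))

def local_region_count(x, radius=0.1, n_samples=500):
    """Count distinct linear regions seen among samples in a ball around x.

    A crude proxy for local complexity near x: more distinct activation
    patterns in a small neighborhood means a denser packing of linear regions.
    """
    pts = x + radius * rng.normal(size=(n_samples, x.shape[0]))
    return len({activation_pattern(p) for p in pts})

print(local_region_count(np.zeros(2)))
```

Shrinking `radius` toward zero drives the count to 1 (a single region locally, almost everywhere), while larger radii reveal how finely the function is partitioned around the data point.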
