Source: arXiv

Fast Convex Optimization for Two-Layer ReLU Networks: Equivalent Model Classes and Cone Decompositions

  • Researchers have developed fast algorithms and robust software for convex optimization of two-layer neural networks with ReLU activation functions.
  • The work leverages a convex reformulation of the weight-decay-penalized training problem as a set of group-ℓ₁-regularized, data-local models, where locality is enforced by polyhedral cone constraints.
  • In the special case of zero regularization, the problem is exactly equivalent to unconstrained optimization of a convex 'gated ReLU' network (a toy version of this convex objective is sketched below).
  • The developed approaches are faster than standard training heuristics for the non-convex problem, such as SGD, and outperform commercial interior-point solvers.
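
Below is a minimal, self-contained NumPy sketch of the convex gated ReLU objective described above. It is not the paper's code: the data and gates are synthetic, all function names (predict, objective, prox_group_l1) are assumptions made for this example, and the solver is plain proximal gradient rather than the accelerated and augmented-Lagrangian methods the researchers developed.

```python
import numpy as np

# Illustrative sketch only, not the authors' implementation: the data are
# synthetic, the gates are random, and every name here (predict, objective,
# prox_group_l1) is an assumption made for this example.

rng = np.random.default_rng(0)
n, d, m = 200, 10, 50                  # samples, features, number of gates
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)
G = rng.standard_normal((d, m))        # fixed (random) gate vectors g_j
D = (X @ G > 0).astype(X.dtype)        # n x m activation patterns 1[X g_j > 0]

def predict(W):
    # Gated ReLU model: f(x_i) = sum_j 1[g_j . x_i > 0] * (w_j . x_i).
    # Because the gates are fixed, f is linear in W, so the loss is convex.
    return np.einsum("nm,nm->n", D, X @ W)

def objective(W, lam):
    data_fit = 0.5 * np.sum((predict(W) - y) ** 2)
    group_l1 = lam * np.linalg.norm(W, axis=0).sum()   # sum_j ||w_j||_2
    return data_fit + group_l1

def prox_group_l1(W, t):
    # Block soft-thresholding: shrink each gate's weight vector toward zero.
    norms = np.linalg.norm(W, axis=0, keepdims=True)
    return W * np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12))

# Safe step size from the spectral norm of the full design matrix.
A = (D[:, :, None] * X[:, None, :]).reshape(n, m * d)
step = 1.0 / np.linalg.norm(A, 2) ** 2

# Plain (unaccelerated) proximal gradient, standing in for the faster
# solvers reported in the paper. With lam = 0 the prox is the identity and
# the problem becomes unconstrained convex least squares, matching the
# zero-regularization equivalence described above.
lam = 0.1
W = np.zeros((d, m))
for _ in range(200):
    resid = predict(W) - y
    grad = X.T @ (D * resid[:, None])   # d x m gradient of the data term
    W = prox_group_l1(W - step * grad, step * lam)
print(objective(W, lam))
```

Each column of W carries the weights for one gate, so the group-ℓ₁ penalty (the sum of column norms) zeroes out entire gates at once, mirroring neuron-level sparsity in the original two-layer network.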
