Source: arXiv

Perturbative Gradient Training: A novel training paradigm for bridging the gap between deep neural networks and physical reservoir computing

  • Perturbative Gradient Training (PGT) is a new training paradigm introduced to address a key limitation of physical reservoir computing: physical reservoirs cannot be trained with backpropagation.
  • PGT applies random perturbations in the network's parameter space and approximates gradient updates using forward passes alone, sidestepping the need to backpropagate through the physical reservoir (see the sketch after this list).
  • The feasibility of PGT was demonstrated both on simulated neural network architectures and on experimental hardware using a magnonic auto-oscillation ring as the physical reservoir.
  • Results indicate that PGT can match the performance of standard backpropagation in scenarios where backpropagation is impractical or impossible. This suggests physical reservoirs could be integrated into deeper neural network architectures, improving the energy efficiency of AI training.
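The forward-pass-only idea can be illustrated with a small sketch. The snippet below shows a generic perturbation-based (SPSA-style) gradient estimator: random directions are sampled in parameter space, and centered differences of the loss along those directions are averaged into a gradient estimate. This is a minimal illustration of the general technique, not the paper's exact algorithm; the toy linear readout, step sizes, and sample counts are all illustrative assumptions.

```python
import numpy as np

def loss(theta, x, y):
    # Toy readout: linear model with mean squared error.
    # Stands in for a forward pass through a physical reservoir,
    # where gradients of internal dynamics are unavailable.
    return np.mean((x @ theta - y) ** 2)

def perturbative_grad(theta, x, y, eps=1e-3, n_samples=32, rng=None):
    """Approximate the gradient of `loss` using only forward passes.

    For each random direction v, the centered difference
    (loss(theta + eps*v) - loss(theta - eps*v)) / (2*eps)
    estimates the directional derivative along v; averaging the
    v-weighted estimates over many directions approximates the
    full gradient without any backward pass.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    grad = np.zeros_like(theta)
    for _ in range(n_samples):
        v = rng.standard_normal(theta.shape)
        delta = loss(theta + eps * v, x, y) - loss(theta - eps * v, x, y)
        grad += (delta / (2 * eps)) * v
    return grad / n_samples

# Usage: fit a toy regression target with perturbation-based updates only.
rng = np.random.default_rng(42)
x = rng.standard_normal((128, 8))
true_theta = rng.standard_normal(8)
y = x @ true_theta
theta = np.zeros(8)
for step in range(200):
    theta -= 0.05 * perturbative_grad(theta, x, y, rng=rng)
print("final loss:", loss(theta, x, y))
```

Because each update needs only evaluations of the loss at perturbed parameter values, the same scheme works when the "forward pass" is a physical device rather than a differentiable program; the trade-off is that many forward passes are needed per update to keep the gradient estimate's variance manageable.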
