Backward Propagation is here to stay, part 1 (AI 2024)

  • Training deep neural networks typically incurs substantial computational cost during both forward and backward propagation.
  • Dropping Backward Propagation (DropBP) is a novel approach designed to reduce these costs while maintaining accuracy.
  • DropBP randomly drops layers during backward propagation while leaving forward propagation untouched (see the sketch below).
  • Using DropBP can reduce training time, speed up convergence, and enable training with longer sequence lengths.
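As a rough illustration only (a minimal PyTorch sketch under assumptions, not the DropBP authors' implementation; the DropBPBlock class, the toy feed-forward block, and the drop_prob rate are hypothetical), a residual block can keep contributing to the forward output while its backward pass is skipped by computing it outside the autograd graph:

```python
import torch
import torch.nn as nn


class DropBPBlock(nn.Module):
    """Residual block whose backward pass can be randomly skipped (DropBP-style sketch)."""

    def __init__(self, dim: int, drop_prob: float = 0.5):
        super().__init__()
        # Hypothetical per-block drop rate and toy feed-forward sub-block, chosen for illustration.
        self.ffn = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
        self.drop_prob = drop_prob

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training and torch.rand(()) < self.drop_prob:
            # Dropped for backward: the block still contributes to the forward
            # output, but computing it under no_grad keeps it out of the autograd
            # graph, so its activations are not saved and no gradients are computed for it.
            with torch.no_grad():
                out = self.ffn(x)
        else:
            out = self.ffn(x)
        # Gradients always flow through the residual (identity) path.
        return x + out


# Usage sketch: only blocks that were not dropped in this step receive gradients.
model = nn.Sequential(*[DropBPBlock(64, drop_prob=0.5) for _ in range(4)])
x = torch.randn(8, 64, requires_grad=True)
loss = model(x).sum()
loss.backward()
```

Because gradients still reach earlier layers through the skip connections, training can proceed even when several blocks are dropped in a given step, which is what saves backward-pass computation in this sketch.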
