Source: arXiv

Balanced Gradient Sample Retrieval for Enhanced Knowledge Retention in Proxy-based Continual Learning

  • Continual learning in deep neural networks often suffers from catastrophic forgetting, where representations for previous tasks are overwritten during subsequent training.
  • A novel sample retrieval strategy is proposed that replays both gradient-conflicting and gradient-aligned memory samples to retain knowledge of past tasks.
  • Gradient-conflicting samples are replayed to reduce interference and re-align the current update with past-task gradients, preserving earlier knowledge.
  • Experiments validate the method's state-of-the-art performance in mitigating forgetting and maintaining competitive accuracy on new tasks.
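The summary above does not spell out the paper's exact retrieval objective, but the core idea of splitting memory samples by gradient agreement can be sketched as follows. This is a minimal NumPy illustration, assuming per-sample gradients for the memory buffer are already available (the function name and the equal-sized conflicting/aligned split are illustrative, not taken from the paper):

```python
import numpy as np

def balanced_gradient_retrieval(mem_grads, cur_grad, k):
    """Retrieve k gradient-conflicting and k gradient-aligned memory samples.

    mem_grads: (N, D) array of per-sample gradients for memory-buffer samples
    cur_grad:  (D,) gradient of the current task batch
    Returns the indices of the 2k retrieved samples.
    """
    # Cosine similarity between each memory-sample gradient and the
    # current-task gradient (small epsilon guards against zero norms).
    sims = mem_grads @ cur_grad / (
        np.linalg.norm(mem_grads, axis=1) * np.linalg.norm(cur_grad) + 1e-12
    )
    order = np.argsort(sims)
    conflicting = order[:k]   # most negative cosine: interfere with current update
    aligned = order[-k:]      # most positive cosine: reinforce shared knowledge
    return np.concatenate([conflicting, aligned])

# Toy example: six memory-sample gradients in a 3-D parameter space.
mem = np.array([
    [ 1.0, 0.0, 0.0],
    [-1.0, 0.0, 0.0],
    [ 0.9, 0.1, 0.0],
    [-0.8, 0.2, 0.0],
    [ 0.0, 1.0, 0.0],
    [ 0.5, 0.5, 0.0],
])
cur = np.array([1.0, 0.0, 0.0])
idx = balanced_gradient_retrieval(mem, cur, k=2)
print(sorted(idx.tolist()))  # → [0, 1, 2, 3]
```

Replaying the conflicting half counteracts updates that would overwrite old-task representations, while the aligned half reinforces features the tasks share; in practice per-sample gradients would come from the model itself (e.g. via `torch.func.grad` in PyTorch) rather than a precomputed array.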
