
Source: arXiv

Relevance-driven Input Dropout: an Explanation-guided Regularization Technique

  • Overfitting is a common issue in machine learning, even for state-of-the-art models, reducing generalization and opening a significant performance gap between the training and test sets.
  • Regularization techniques such as dropout, data augmentation, and weight decay are commonly used to counter it.
  • A new data augmentation method called Relevance-driven Input Dropout (RelDrop) is proposed, which selectively occludes the most relevant regions of the input to improve generalization through informed, explanation-guided regularization (a minimal sketch follows this list).
  • Experiments on benchmark datasets show that RelDrop improves robustness to occlusion, encourages models to draw on more input features when predicting, and improves generalization at inference time.
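The summary does not specify how relevance is computed or how much of the input is occluded, so the sketch below is an assumption-laden illustration rather than the paper's method: it uses plain input-gradient saliency as the relevance signal and a fixed drop_fraction as the occlusion budget, and the function name relevance_dropout and its parameters are hypothetical.

import torch
import torch.nn.functional as F

def relevance_dropout(model, x, y, drop_fraction=0.1):
    """Occlude the most relevant input pixels before the training step.

    Sketch only: gradient saliency stands in for whatever attribution
    method the paper actually uses; drop_fraction is illustrative.
    """
    # Saliency pass: make the input a leaf tensor that accumulates gradients.
    x = x.detach().clone().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    # Per-pixel relevance: input-gradient magnitude, summed over channels.
    relevance = x.grad.detach().abs().sum(dim=1, keepdim=True)  # (N, 1, H, W)
    # Per-image threshold so the top drop_fraction of pixels gets masked.
    flat = relevance.flatten(1)                                 # (N, H*W)
    k = max(1, int(drop_fraction * flat.shape[1]))
    thresh = flat.topk(k, dim=1).values[:, -1].view(-1, 1, 1, 1)
    mask = (relevance < thresh).float()  # 0 at the most relevant pixels
    # The saliency pass also populated parameter gradients; clear them
    # so they do not leak into the real optimization step.
    model.zero_grad(set_to_none=True)
    return x.detach() * mask, mask

In training, the occluded batch returned here would replace the raw batch for the actual forward/backward pass, forcing the model to predict without its currently most-relied-upon evidence.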
