techminis

A naukri.com initiative

Source: Arxiv

Towards Adversarially Robust Dataset Distillation by Curvature Regularization

  • Dataset distillation (DD) compresses a dataset to a fraction of its original size while preserving its distributional information, so that models trained on the distilled data reach comparable accuracy at a much lower computational cost.
  • This paper explores a new perspective on dataset distillation by embedding adversarial robustness, enabling models trained on the distilled datasets to maintain high clean accuracy along with better robustness to adversarial perturbations.
  • The proposed method incorporates curvature regularization into the distillation process, achieving better accuracy and robustness than standard adversarial training while incurring lower computational overhead.
  • Empirical experiments demonstrate that the method generates robust distilled datasets capable of withstanding a variety of adversarial attacks.
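The core idea behind a curvature regularizer can be sketched with a finite-difference proxy: penalize how quickly the loss gradient changes along a random direction around an input, since low curvature of the loss surface correlates with adversarial robustness. The toy quadratic loss and helper names below are illustrative assumptions, not the paper's actual distillation pipeline.

```python
import math
import random

# Hypothetical toy loss L(x) = sum(x_i^2); its analytic gradient is 2x.
# In practice this would be a model's loss w.r.t. the (distilled) inputs.
def loss_grad(x):
    return [2.0 * xi for xi in x]

def curvature_penalty(x, h=0.01, seed=0):
    """Finite-difference curvature proxy:
        || g(x + h*d) - g(x) ||^2 / h^2
    where g is the input gradient of the loss and d is a random unit
    direction (assumption: one random direction per sample, in the
    spirit of CURE-style curvature regularizers)."""
    rng = random.Random(seed)
    d = [rng.gauss(0.0, 1.0) for _ in x]
    norm = math.sqrt(sum(di * di for di in d))
    d = [di / norm for di in d]  # normalize to a unit direction
    g0 = loss_grad(x)
    g1 = loss_grad([xi + h * di for xi, di in zip(x, d)])
    # Squared norm of the gradient difference, rescaled by the step size:
    # large values mean high curvature (sharp loss surface) at x.
    return sum((a - b) ** 2 for a, b in zip(g1, g0)) / h**2

# For the quadratic loss the Hessian is 2*I, so the penalty is
# constant (4.0) regardless of x — a quick sanity check.
print(curvature_penalty([1.0, 2.0, 3.0]))
```

During distillation, a term like this (weighted by a hyperparameter) would be added to the usual distillation objective, steering the synthetic data toward regions where trained models have flat, robust loss surfaces.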
