Source: Arxiv
A Self-Supervised Paradigm for Data-Efficient Medical Foundation Model Pre-training: V-information Optimization Framework

  • Self-supervised pre-training of medical foundation models on large-scale datasets is the standard route to strong performance.
  • However, simply increasing pre-training data volume does not necessarily improve model performance.
  • Introducing V-information into self-supervised pre-training provides a theoretical foundation for sample selection.
  • OptiDEL, an optimized data-effective learning method, outperforms existing approaches on multiple datasets while using 20x less training data.
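The summary above does not spell out how OptiDEL scores samples, but the general V-information idea can be sketched: a sample's pointwise V-information is the drop in predictive loss a model family achieves when it is allowed to condition on the input, and samples with the highest scores are kept for pre-training. The function names, the `(id, loss_without_x, loss_with_x)` tuples, and the loss values below are all hypothetical illustrations, not the paper's actual method.

```python
def pointwise_v_information(loss_without_x, loss_with_x):
    """Estimated pointwise V-information of an input x for its target:
    the reduction in predictive loss (in nats) when the model may
    condition on x. Higher means more model-usable information."""
    return loss_without_x - loss_with_x

def select_samples(samples, k):
    """Keep the k samples with the highest estimated V-information.

    `samples` is a list of (sample_id, loss_without_x, loss_with_x)
    tuples, where the two losses come from evaluating a model with
    and without access to the input (hypothetical values here).
    """
    scored = [
        (pointwise_v_information(lw, lc), sid)
        for sid, lw, lc in samples
    ]
    scored.sort(reverse=True)  # most informative samples first
    return [sid for _, sid in scored[:k]]

# Toy example with made-up losses: "b" is the most informative sample.
samples = [
    ("a", 2.0, 1.9),  # tiny loss reduction: little usable information
    ("b", 2.0, 0.5),  # large loss reduction: highly informative
    ("c", 2.0, 1.2),
]
print(select_samples(samples, 2))  # -> ['b', 'c']
```

Under this framing, discarding low-scoring samples is what lets a data-effective method train on a small fraction of the pool without losing performance.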
