techminis

A naukri.com initiative

Image Credit: Arxiv

Generalization Bounds and Stopping Rules for Learning with Self-Selected Data

  • Learning paradigms such as active learning, semi-supervised learning, bandits, and boosting select their own training data based on previously learned parameters.
  • Reciprocal learning unifies these paradigms; this article focuses on the generalization ability of methods that learn from self-selected samples.
  • The article presents universal generalization bounds for reciprocal learning, built from covering numbers and Wasserstein ambiguity sets, without assumptions on the data distribution.
  • Results cover both convergent solutions and solutions after finitely many iterations, yielding anytime-valid stopping rules that let practitioners certify out-of-sample performance, illustrated on the semi-supervised learning case.
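The anytime-valid stopping idea above can be sketched as follows. This is a minimal illustration, not the paper's method: it assumes a hypothetical bound of the form "empirical risk plus a complexity term shrinking as O(sqrt(log(1/δ)/n))", whereas the paper's actual bounds are built from covering numbers and Wasserstein ambiguity sets. The confidence budget is split across iterations so the guarantee holds at whichever iteration the practitioner stops.

```python
import math

def generalization_bound(emp_risk, n, delta, scale=1.0):
    # Hypothetical bound shape: empirical risk + complexity term.
    # The paper derives its complexity term from covering numbers
    # and Wasserstein ambiguity sets; this stand-in only mimics the
    # O(sqrt(log(1/delta)/n)) decay for illustration.
    return emp_risk + scale * math.sqrt(math.log(1.0 / delta) / n)

def anytime_stopping(risks, sizes, target, delta=0.05):
    # Anytime-valid idea: spend confidence delta/(t*(t+1)) at
    # iteration t (the series sums to < delta, a union bound), so
    # stopping at ANY iteration where the bound clears the target
    # still gives a valid overall guarantee.
    for t, (r, n) in enumerate(zip(risks, sizes), start=1):
        delta_t = delta / (t * (t + 1))
        if generalization_bound(r, n, delta_t) <= target:
            return t  # first iteration at which we may safely stop
    return None  # bound never certified the target
```

For example, with empirical risks `[0.5, 0.3, 0.1]` on self-selected samples of sizes `[100, 400, 1600]` and a target risk of `0.4`, the rule stops at iteration 3, once the bound (not just the empirical risk) falls below the target.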
