Source: Arxiv

Disentangled Feature Importance

  • Standard methods for quantifying feature importance underestimate a feature's contribution when predictors are correlated.
  • Disentangled Feature Importance (DFI) addresses this limitation by transforming correlated features into independent latent variables through a transport map (see the sketch after this list).
  • DFI yields a principled decomposition of importance scores: they sum to the total predictive variability for latent additive models, and to interaction-weighted functional ANOVA variances under arbitrary feature dependencies.
  • A comprehensive semiparametric theory establishes root-n consistency and asymptotic normality of the importance estimators in the latent space, while remaining computationally efficient.
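
To make the transport-map idea concrete, here is a minimal numerical sketch (not from the paper): it assumes a Gaussian design, uses Cholesky whitening as the transport map to independent latents, and scores each latent coordinate by the variance of a binned conditional mean. For the additive latent model simulated below, the scores sum approximately to the total predictive variability. The estimator, function names, and simulation setup are illustrative assumptions, not the paper's construction.

```python
# Minimal sketch of the DFI idea under a Gaussian design, where whitening by
# the Cholesky factor of the covariance is one valid transport map to
# independent latent variables (the paper handles general dependence).
import numpy as np

rng = np.random.default_rng(0)
n, d = 50_000, 3

# Independent latents and a correlating lower-triangular map (ground truth);
# the triangular choice makes Cholesky whitening recover the same latents.
Z_true = rng.standard_normal((n, d))
A = np.array([[1.0, 0.0, 0.0],
              [0.8, 0.6, 0.0],
              [0.3, 0.3, 0.9]])
X = Z_true @ A.T                      # correlated observed features

# Additive signal in the latent space plus noise.
f1 = Z_true[:, 0] ** 2 - 1.0          # variance 2
f2 = 2.0 * Z_true[:, 1]               # variance 4
y = f1 + f2 + 0.5 * rng.standard_normal(n)

# --- DFI-style pipeline using only the observed data (X, y) ---------------
# 1) Transport map: whiten X to obtain (approximately) independent latents.
cov = np.cov(X, rowvar=False)
L = np.linalg.cholesky(cov)
Z_hat = (X - X.mean(axis=0)) @ np.linalg.inv(L).T

# 2) Importance of latent coordinate j as Var(E[y | Z_j]), estimated with a
#    crude quantile-binned conditional mean (an illustrative estimator).
def latent_importance(z_col, y, n_bins=50):
    edges = np.quantile(z_col, np.linspace(0, 1, n_bins + 1))
    idx = np.digitize(z_col, edges[1:-1])          # bin index 0..n_bins-1
    cond_mean = np.array([y[idx == b].mean() for b in range(n_bins)])
    counts = np.array([(idx == b).sum() for b in range(n_bins)])
    overall = np.average(cond_mean, weights=counts)
    return np.average((cond_mean - overall) ** 2, weights=counts)

importances = np.array([latent_importance(Z_hat[:, j], y) for j in range(d)])
print("latent importances:", np.round(importances, 2))
print("sum of importances:", round(importances.sum(), 2))
print("signal variance   :", round(f1.var() + f2.var(), 2))
# For this additive latent model the importances should sum approximately to
# the total predictive variability, here Var(f1) + Var(f2) ≈ 6.
```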
