Standard methods for quantifying feature importance underestimate individual contributions when predictors are correlated.
Disentangled Feature Importance (DFI) addresses this limitation by transforming correlated features into independent latent variables through a transport map.
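A minimal sketch of the idea, assuming a linear Gaussian transport map (Cholesky whitening) as a simple special case; the data, variable names, and map choice here are illustrative assumptions, not the paper's general construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy correlated Gaussian features (illustrative data, not from the paper)
Sigma = np.array([[1.0, 0.8], [0.8, 1.0]])
L = np.linalg.cholesky(Sigma)
X = rng.standard_normal((5000, 2)) @ L.T

# Gaussian transport map: whiten X into (approximately) independent latents Z
mu = X.mean(axis=0)
L_hat = np.linalg.cholesky(np.cov(X, rowvar=False))
Z = np.linalg.solve(L_hat, (X - mu).T).T

print(np.corrcoef(Z, rowvar=False))  # near-identity: latent coordinates are uncorrelated
```

Under arbitrary (non-Gaussian) dependence, the whitening step would be replaced by a more general transport map; importance is then measured on the independent latent coordinates rather than on the raw correlated features.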
DFI provides a principled decomposition of importance scores that sum to the total predictive variability for latent additive models, and to interaction-weighted functional ANOVA variances, under arbitrary feature dependencies.
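A toy numerical check of the additive case, under the assumption that the latent coordinates are independent and the importance of each coordinate equals the variance of its additive component; the model and values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
Z = rng.standard_normal((100_000, 2))   # independent latent coordinates

f1 = Z[:, 0] ** 2        # additive component in Z_1
f2 = 2.0 * Z[:, 1]       # additive component in Z_2
f = f1 + f2              # latent additive model f(Z)

psi = np.array([f1.var(), f2.var()])   # per-coordinate importance scores
print(psi, psi.sum(), f.var())         # psi.sum() matches Var(f) (about 2 + 4 = 6)
```

Because the latent coordinates are independent, the component variances add up to the total predictive variability, which is the decomposition property the additive case of DFI guarantees.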
A comprehensive semiparametric theory for DFI establishes root-n consistency and asymptotic normality of the importance estimators in the latent space while keeping the estimators computationally efficient.