Spurious correlations pose a challenge for robust real-world generalization in machine learning.
Existing methods address this issue by maximizing group-balanced or worst-group accuracy, but they rely heavily on expensive bias annotations.
A new method is proposed that tackles spurious correlations by reframing them as imbalances or mismatches in class-conditional distributions, eliminating the need for bias annotations or bias predictions. By balancing class-conditional distributions, the method yields a debiased data distribution for classification and delivers state-of-the-art performance.
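To make the idea of class-conditional distribution balancing concrete, the sketch below shows one generic way to realize it: reweighting each sample by an estimated density ratio between the pooled feature distribution and its class-conditional distribution, so that every class's reweighted feature distribution is pulled toward the common pooled one. This is an illustrative assumption, not the paper's actual algorithm; the kernel-density estimator, the 1-D feature, and the function names are all hypothetical.

```python
import numpy as np

def gaussian_kde(points, queries, bandwidth=0.5):
    # Simple 1-D Gaussian kernel density estimate of `points`,
    # evaluated at `queries`. Purely for illustration.
    d = queries[:, None] - points[None, :]
    k = np.exp(-0.5 * (d / bandwidth) ** 2)
    return k.mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))

def balance_weights(x, y):
    # Weight each sample by w_i ~ p(x_i) / p(x_i | y_i): samples that
    # are rare within their class but common overall get upweighted,
    # pulling each class-conditional feature distribution toward the
    # pooled distribution -- no bias/group labels are used.
    p_pooled = gaussian_kde(x, x)
    w = np.empty_like(x, dtype=float)
    for c in np.unique(y):
        mask = y == c
        p_class = gaussian_kde(x[mask], x[mask])
        w[mask] = p_pooled[mask] / np.maximum(p_class, 1e-12)
        # Normalize within each class so classes stay equally weighted.
        w[mask] /= w[mask].sum()
    return w
```

Usage: on data where class 0's feature is centered at -1 and class 1's at +1 (a class-conditional mismatch), the reweighted per-class feature means move toward the pooled mean, since weighting `p(x|y)` by `p(x)/p(x|y)` recovers the pooled distribution in the ideal case.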