Variable selection in high-dimensional sparse learning with group structures is challenging.
Group SLOPE is effective for adaptive selection of predictor groups, but its sorted penalty couples all groups together, producing block non-separable group effects.
Existing screening methods are either invalid for such non-separable effects or handle them inefficiently, incurring high computational cost and memory usage.
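For context, a standard formulation of the Group SLOPE estimator (notation ours; the grouping and weights are illustrative) penalizes the sorted group norms:

\[
\hat{\beta} \in \arg\min_{\beta \in \mathbb{R}^p} \; \frac{1}{2}\,\lVert y - X\beta \rVert_2^2 \;+\; \sum_{i=1}^{m} \lambda_i \,\lVert \beta_{G_{(i)}} \rVert_2,
\qquad \lambda_1 \ge \cdots \ge \lambda_m \ge 0,
\]

where $G_1, \dots, G_m$ partition the predictors and the ordering $(i)$ sorts the group norms decreasingly, $\lVert \beta_{G_{(1)}} \rVert_2 \ge \cdots \ge \lVert \beta_{G_{(m)}} \rVert_2$. Because the weight a group receives depends on how its norm ranks against every other group's, the penalty does not decompose into independent per-group terms; this block non-separability is what defeats screening rules designed for separable penalties such as the group lasso.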
To address this, a new safe screening rule tailored to Group SLOPE identifies inactive groups, i.e., groups whose coefficients are zero at the optimum, while properly accounting for the block non-separable group effects. Excluding these inactive groups during training yields substantial savings in computation and memory.
The screening rule integrates seamlessly with existing solvers, both batch and stochastic.
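To make the integration point concrete, here is a minimal sketch of how such a rule can wrap an existing solver (our illustration, not the paper's implementation; `is_inactive` stands in for the actual screening test and `solver` for any batch or stochastic solver):

```python
import numpy as np

def solve_with_screening(X, y, groups, is_inactive, solver):
    """Fit with safe screening: drop certified-inactive groups, solve the
    reduced problem, and pad the screened coefficients back with zeros.

    groups      : list of integer index arrays, one per group
    is_inactive : callable(X, y, g) -> bool; must return True only when
                  group g is certified to be zero at the optimum
    solver      : callable(X_sub, y) -> coefficient vector for X_sub
    """
    kept = [g for g in groups if not is_inactive(X, y, g)]
    beta = np.zeros(X.shape[1])
    if kept:
        keep_idx = np.concatenate(kept)
        # The solver only ever sees the surviving columns.
        beta[keep_idx] = solver(X[:, keep_idx], y)
    return beta

# Toy usage with stand-ins: a test that never screens (trivially safe) and
# plain least squares in place of a real Group SLOPE solver.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 12))
y = rng.standard_normal(50)
groups = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]
beta = solve_with_screening(
    X, y, groups,
    is_inactive=lambda X, y, g: False,
    solver=lambda Xs, ys: np.linalg.lstsq(Xs, ys, rcond=None)[0],
)
```

Because a safe test only discards groups that are provably zero at the optimum, solving the reduced problem recovers exactly the full solution, which is the guarantee stated next.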
Theoretically, the rule is safe: combined with any existing optimization algorithm, it is guaranteed to return the same solution as solving the full problem without screening.
Experimental results show that the method detects inactive feature groups effectively, enhancing computational efficiency without compromising accuracy.