Neural networks achieve strong performance in supervised learning, but collecting annotated data is expensive. Semi-supervised learning (SSL) offers a remedy by exploiting unlabeled data through consistency regularization and pseudo-labeling.
Recent SSL methods such as Semi-ViT and Noisy Student have made significant progress, but their reliance on a single fixed confidence threshold makes it difficult to select high-quality pseudo-labels.
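For concreteness, the following is a minimal sketch of the fixed-threshold pseudo-label selection these methods rely on; the function name and the tau=0.95 default are illustrative assumptions, not taken from any of the cited papers.

```python
import torch
import torch.nn.functional as F

def fixed_threshold_pseudo_labels(logits: torch.Tensor, tau: float = 0.95):
    """Keep pseudo-labels whose max predicted probability exceeds a single
    fixed confidence threshold tau (illustrative sketch; tau is a guess).

    Returns hard pseudo-labels and a boolean mask of retained samples.
    """
    probs = F.softmax(logits, dim=-1)        # (batch, num_classes)
    conf, pseudo_labels = probs.max(dim=-1)  # per-sample confidence and label
    mask = conf >= tau                       # one threshold for every class
    return pseudo_labels, mask
```

A single tau treats all classes alike: hard classes rarely clear it, so their pseudo-labels are discarded, which motivates the class-specific thresholds discussed next.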
More recent methods such as FlexMatch and FreeMatch introduce flexible thresholding, but recomputing thresholds at every iteration is time-consuming. SST (Self-training with Self-adaptive Thresholding) addresses this by adjusting class-specific thresholds according to the model's learning progress, enabling efficient SSL.
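A minimal sketch of what class-specific self-adaptive thresholding can look like, assuming an EMA of per-class confidence as the learning-progress signal; the class, its per-batch update cadence, and the momentum and init values are illustrative approximations, not SST's published update rule.

```python
import torch
import torch.nn.functional as F

class SelfAdaptiveThreshold:
    """Class-specific thresholds that track learning progress (sketch).

    Each class keeps an EMA estimate of the model's confidence on samples
    assigned to it; the per-class threshold follows that estimate, so
    well-learned classes are filtered strictly while hard classes retain
    more pseudo-labels. Hyperparameters here are assumptions.
    """

    def __init__(self, num_classes: int, momentum: float = 0.999,
                 init: float = 0.5):
        self.momentum = momentum
        self.class_conf = torch.full((num_classes,), init)  # EMA per class

    @torch.no_grad()
    def update(self, logits: torch.Tensor) -> None:
        probs = F.softmax(logits, dim=-1)
        conf, labels = probs.max(dim=-1)
        for c in labels.unique():  # refresh the EMA of each class seen
            m = labels == c
            self.class_conf[c] = (self.momentum * self.class_conf[c]
                                  + (1 - self.momentum) * conf[m].mean())

    @torch.no_grad()
    def select(self, logits: torch.Tensor):
        probs = F.softmax(logits, dim=-1)
        conf, labels = probs.max(dim=-1)
        mask = conf >= self.class_conf[labels]  # per-class threshold lookup
        return labels, mask
```

Because the thresholds are read from a small per-class buffer rather than recomputed from statistics over the whole unlabeled set, selection stays cheap at each step, which is the efficiency argument the method makes against per-iteration threshold recomputation.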
Extensive experiments show that SST achieves state-of-the-art SSL performance with high efficiency and scalability. On ImageNet-1K SSL benchmarks, Semi-SST-ViT-Huge surpasses the fully-supervised DeiT-III-ViT-Huge while using only 10% of the labeled data.