Batch Normalization (BN) is commonly used in conventional deep neural network training to normalize layer activations with per-batch statistics, stabilizing the input distribution seen by each layer.
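For reference, standard BN first normalizes each activation with batch statistics and then applies a learnable affine transform:

\[
\hat{x}_i = \frac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}}, \qquad y_i = \gamma \hat{x}_i + \beta,
\]

where \(\mu_B\) and \(\sigma_B^2\) are the statistical parameters (batch mean and variance) and \(\gamma, \beta\) are the learnable parameters. This split between statistical and learnable parameters is exactly the one HBN treats differently below.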
Federated learning, a distributed learning paradigm, struggles when data across client nodes are not independent and identically distributed (non-IID), which makes updating BN parameters particularly challenging.
Hybrid Batch Normalization (HBN) is introduced as a normalization scheme tailored to federated learning: it decouples the update of the statistical parameters (mean and variance) from the update of the learnable parameters (scale and shift) to improve performance.
HBN additionally includes a learnable hybrid distribution factor that lets each computing node dynamically mix its current batch statistics with global statistics, improving federated learning performance across a range of scenarios.
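A minimal PyTorch sketch of this idea is given below. It is illustrative only, not the authors' reference implementation: the module name HybridBatchNorm2d, the per-channel factor mix_logit, and the buffers global_mean and global_var are assumed names, and the global statistics are presumed to be broadcast by the server rather than trained locally.

```python
import torch
import torch.nn as nn

class HybridBatchNorm2d(nn.Module):
    """Sketch of a hybrid BN layer: a learnable factor mixes local batch
    statistics with server-provided global statistics (assumed design)."""

    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        # Learnable affine parameters, updated like ordinary model weights.
        self.weight = nn.Parameter(torch.ones(num_features))
        self.bias = nn.Parameter(torch.zeros(num_features))
        # Learnable hybrid distribution factor, kept in (0, 1) via sigmoid.
        self.mix_logit = nn.Parameter(torch.zeros(num_features))
        # Global statistics received from the server; not trained locally.
        self.register_buffer("global_mean", torch.zeros(num_features))
        self.register_buffer("global_var", torch.ones(num_features))
        self.eps = eps

    def forward(self, x):  # x: (N, C, H, W)
        alpha = torch.sigmoid(self.mix_logit)  # per-channel hybrid factor
        batch_mean = x.mean(dim=(0, 2, 3))
        batch_var = x.var(dim=(0, 2, 3), unbiased=False)
        # Mix current batch statistics with global statistics.
        mean = alpha * batch_mean + (1 - alpha) * self.global_mean
        var = alpha * batch_var + (1 - alpha) * self.global_var
        x_hat = (x - mean[None, :, None, None]) / torch.sqrt(
            var[None, :, None, None] + self.eps
        )
        return (self.weight[None, :, None, None] * x_hat
                + self.bias[None, :, None, None])
```

Under this sketch, alpha near 1 trusts the local batch (useful when a client's data distribution is far from the global one), while alpha near 0 falls back on the shared global statistics; because alpha is learnable, each client can find its own balance during federated training.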