Source: Arxiv

3d

read

302

img
dot

Image Credit: Arxiv

Hybrid Batch Normalisation: Resolving the Dilemma of Batch Normalisation in Federated Learning

  • Batch Normalization (BN) is commonly used in traditional deep neural network training to normalize each layer's inputs over a mini-batch, stabilizing input-output distributions.
  • Federated learning, a distributed learning approach, struggles with non-IID (non-independent and identically distributed) data across client nodes, which makes updating BN parameters challenging.
  • Hybrid Batch Normalization (HBN) is introduced as a tailored normalization solution for federated learning, separating statistical parameter updates from learnable parameter updates to improve performance.
  • HBN includes a learnable hybrid distribution factor that enables computing nodes to dynamically mix current batch statistics with global statistics, enhancing federated learning across various scenarios.
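The mixing described in the last bullet can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function name, the scalar factor `alpha` (standing in for the learnable hybrid distribution factor), and the affine parameters `gamma`/`beta` are assumptions for the example.

```python
import numpy as np

def hybrid_batch_norm(x, global_mean, global_var, alpha, gamma, beta, eps=1e-5):
    """Normalize x by mixing current batch statistics with global statistics.

    `alpha` plays the role of the hybrid distribution factor (hypothetical name):
    alpha = 1.0 uses only the local batch statistics (standard BN behavior),
    alpha = 0.0 uses only the global statistics shared across federated clients.
    """
    batch_mean = x.mean(axis=0)
    batch_var = x.var(axis=0)
    # Blend local batch statistics with the globally aggregated statistics.
    mean = alpha * batch_mean + (1.0 - alpha) * global_mean
    var = alpha * batch_var + (1.0 - alpha) * global_var
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta
```

In a federated setting, each client would apply a normalization of this shape during local training while the server aggregates the global statistics; learning `alpha` per layer is what lets each node adapt the blend to its own data distribution.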
