Image Credit: Arxiv

Weighted Loss Methods for Robust Federated Learning under Data Heterogeneity

  • Federated learning (FL) enables multiple data holders to train a machine learning model collaboratively without sharing their data externally.
  • In FL, each worker updates the model locally on its own data and shares its gradients with a central server for aggregation.
  • Byzantine-resilient FL prevents malicious participants from derailing model convergence.
  • Common FL strategies ignore outlier gradients to thwart such attacks (see the sketch after this list).
  • In heterogeneous (non-IID) data settings, however, it is hard to tell malicious gradients apart from honest gradients that diverge simply because workers hold different data.
  • A new approach, the Worker Label Alignment Loss (WoLA), aligns the gradients of honest workers even under data heterogeneity.
  • This alignment makes malicious gradients easier to identify, and WoLA outperforms existing methods in such settings.
  • The paper provides theoretical insights and empirical evidence supporting WoLA's effectiveness.

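To make the pipeline concrete, here is a minimal, self-contained sketch of one such training loop: workers compute gradients on their own non-IID data, a minority of Byzantine workers send corrupted updates, and the server aggregates with a coordinate-wise median, one standard outlier-ignoring rule. This is an illustrative NumPy toy with a linear model, not the paper's WoLA method; the worker counts, learning rate, and corruption model are arbitrary assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_workers, n_byzantine = 5, 10, 2

# Ground-truth parameters that the honest workers implicitly share.
w_true = rng.normal(size=dim)

# Heterogeneous (non-IID) private datasets: each worker has its own covariate shift.
workers = []
for k in range(n_workers):
    shift = rng.normal(scale=0.5)
    X = rng.normal(loc=shift, size=(50, dim))
    y = X @ w_true + 0.1 * rng.normal(size=50)
    workers.append((X, y))

def local_gradient(w, X, y):
    """Mean-squared-error gradient computed on one worker's private data."""
    return X.T @ (X @ w - y) / len(y)

w_global = np.zeros(dim)   # model held by the central server
lr = 0.05

for _ in range(300):
    grads = []
    for k, (X, y) in enumerate(workers):
        g = local_gradient(w_global, X, y)
        if k < n_byzantine:
            g = -10.0 * g  # Byzantine workers send arbitrarily corrupted gradients
        grads.append(g)

    # Robust aggregation: the coordinate-wise median discards extreme (outlier)
    # gradient coordinates rather than averaging them in.
    g_agg = np.median(np.stack(grads), axis=0)
    w_global -= lr * g_agg

print("distance to w_true:", np.linalg.norm(w_global - w_true))
```

The median keeps the honest majority's signal even when attackers scale or flip their gradients, which is what outlier-ignoring defenses rely on; it is exactly this assumption that becomes fragile when honest gradients themselves diverge under data heterogeneity, the setting WoLA targets.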