Source: Arxiv

Communication-Efficient and Personalized Federated Foundation Model Fine-Tuning via Tri-Matrix Adaptation

  • CE-LoRA (communication-efficient federated LoRA adaptation) is a new method for fine-tuning pre-trained foundation models in federated learning.
  • CE-LoRA uses a tri-factorization low-rank adaptation scheme combined with personalized aggregation of model parameters.
  • By introducing a small dense matrix and weighting aggregation by client similarity, CE-LoRA reduces communication cost while achieving comparable empirical performance.
  • Experiments show that CE-LoRA significantly reduces communication overhead, improves performance under non-iid data conditions, and strengthens data privacy protection.
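The two ideas in the bullets can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the matrix shapes, the choice of which factors are frozen versus communicated, and the cosine-similarity aggregation rule are all assumptions made for the sketch. The core intuition is that factorizing the LoRA update as A·C·B with a small dense C ∈ R^(r×r) lets clients exchange only C, which is far smaller than the standard LoRA factors.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r = 512, 512, 8  # layer dimensions and LoRA rank (illustrative values)

# Tri-factorized update: delta_W = A @ C @ B.
# Assumption for this sketch: A and B stay local after initialization,
# and only the small dense C is communicated each round.
A = rng.standard_normal((d, r)) / np.sqrt(d)  # (d, r), kept local
B = rng.standard_normal((r, k)) / np.sqrt(r)  # (r, k), kept local
C = np.zeros((r, r))                          # (r, r), trainable + communicated
delta_W = A @ C @ B                           # full-size update, never transmitted

# Communicated values per layer: standard LoRA factors vs. C alone.
print(r * (d + k), "vs", r * r)  # 8192 vs 64

# Personalized aggregation (sketch): weight other clients' C factors
# by cosine similarity, so similar clients influence each other more.
def personalized_aggregate(Cs):
    flat = np.stack([c.ravel() for c in Cs])
    unit = flat / (np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12)
    w = np.maximum(unit @ unit.T, 0.0)        # clip negative similarities
    w /= w.sum(axis=1, keepdims=True)         # row-normalize weights
    return [sum(w[i, j] * Cs[j] for j in range(len(Cs))) for i in range(len(Cs))]

client_Cs = [rng.standard_normal((r, r)) for _ in range(3)]
new_Cs = personalized_aggregate(client_Cs)    # one personalized C per client
```

With these illustrative shapes, per-layer communication drops from r·(d+k) = 8192 values for standard LoRA to r² = 64 for C alone, which is the kind of saving the summary attributes to the method.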
