Source: Arxiv
CoMoE: Contrastive Representation for Mixture-of-Experts in Parameter-Efficient Fine-tuning

  • Mixture-of-experts (MoE) is a popular approach in parameter-efficient fine-tuning for balancing model capacity against computational overhead.
  • Existing MoE variants struggle on heterogeneous datasets because the capacity of individual experts is underutilized.
  • CoMoE (Contrastive Representation for Mixture-of-Experts) adds a contrastive objective to MoE to strengthen modularization and specialization among experts (an illustrative sketch follows this list).
  • Experiments across various benchmarks and multi-task settings show that CoMoE improves MoE's capacity and promotes modularization among experts.

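The summary does not spell out how the contrastive objective is formulated, so the sketch below is only an illustration of the general idea, not the paper's method: it assumes LoRA-style adapter experts with soft token-level routing and an InfoNCE-style term that pushes different experts' representations of the same token apart, so that experts specialize. All names here (LoRAExpert, MoELoRALayer, contrastive_expert_loss) and the loss weight are hypothetical.

```python
# Illustrative sketch only: a MoE-of-LoRA layer with an auxiliary contrastive term.
# The class/function names and the exact loss are assumptions, not CoMoE's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LoRAExpert(nn.Module):
    """One low-rank adapter expert: delta(x) = B(A(x)) with rank r."""
    def __init__(self, d_model: int, r: int = 8):
        super().__init__()
        self.A = nn.Linear(d_model, r, bias=False)
        self.B = nn.Linear(r, d_model, bias=False)

    def forward(self, x):
        return self.B(self.A(x))


class MoELoRALayer(nn.Module):
    """Soft token-level routing over several LoRA experts."""
    def __init__(self, d_model: int, n_experts: int = 4, r: int = 8):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList([LoRAExpert(d_model, r) for _ in range(n_experts)])

    def forward(self, x):
        gates = F.softmax(self.router(x), dim=-1)                  # (B, T, E)
        outs = torch.stack([e(x) for e in self.experts], dim=-2)   # (B, T, E, D)
        mixed = (gates.unsqueeze(-1) * outs).sum(dim=-2)           # (B, T, D)
        return mixed, outs


def contrastive_expert_loss(expert_outs, tau: float = 0.1):
    """InfoNCE-style penalty: for each token, an expert's output should be
    dissimilar to the other experts' outputs, encouraging specialization."""
    B, T, E, D = expert_outs.shape
    z = F.normalize(expert_outs.reshape(B * T, E, D), dim=-1)
    sim = torch.einsum("ned,nfd->nef", z, z) / tau          # pairwise expert similarities
    labels = torch.arange(E, device=z.device).expand(B * T, E)
    return F.cross_entropy(sim.reshape(-1, E), labels.reshape(-1))


# Usage: the contrastive term is added to the task loss with a small weight.
x = torch.randn(2, 16, 64)
layer = MoELoRALayer(d_model=64)
delta, expert_outs = layer(x)
task_loss = delta.pow(2).mean()   # stand-in for the real fine-tuning loss
loss = task_loss + 0.1 * contrastive_expert_loss(expert_outs)
loss.backward()
```

In this toy version the only trainable parts are the router and the low-rank adapters, which is what makes the setup parameter-efficient; the contrastive weight (0.1 here) is an arbitrary choice for illustration.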