Source: Arxiv
Towards Minimizing Feature Drift in Model Merging: Layer-wise Task Vector Fusion for Adaptive Knowledge Integration

  • Multi-task model merging techniques aim to combine knowledge from task-specific experts into a unified model efficiently.
  • A new approach called Layer-wise Optimal Task Vector Merging (LOT Merging) is introduced to minimize feature drift during model merging.
  • LOT Merging minimizes the feature differences between each task-specific expert and the unified model layer by layer, reducing drift in intermediate representations and improving merged-model performance (a sketch of the layer-wise idea follows this list).
  • Extensive experiments show that LOT Merging outperforms existing methods, achieving improvements of up to 4.4% on vision and vision-language benchmarks.
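
Since the merging is done layer by layer, the core computation can be illustrated with a small, self-contained sketch. The code below is not the authors' implementation: it assumes a simple least-squares variant in which each linear layer's merged weight is chosen to minimize the summed output discrepancy to every expert on a batch of calibration activations. The names (`merge_layer`, `layerwise_merge`, `expert_states`, `layer_inputs_by_name`) and the calibration-feature setup are illustrative assumptions; the exact LOT Merging objective and its closed-form solution are given in the paper.

```python
# Hypothetical sketch of layer-wise merging that minimizes feature drift.
# Not the paper's code; a least-squares stand-in for the layer-wise objective.
import torch

def merge_layer(expert_weights, layer_inputs, eps=1e-6):
    """Closed-form least-squares merge of one linear layer.

    expert_weights: list of (d_out, d_in) tensors, one per task expert.
    layer_inputs:   list of (n_t, d_in) calibration activations feeding this
                    layer, one batch per task.
    Returns the (d_out, d_in) weight W minimizing
        sum_t || X_t W^T - X_t W_t^T ||_F^2.
    """
    d_out, d_in = expert_weights[0].shape
    dtype = expert_weights[0].dtype
    gram = torch.zeros(d_in, d_in, dtype=dtype)   # sum_t X_t^T X_t
    rhs = torch.zeros(d_in, d_out, dtype=dtype)   # sum_t X_t^T X_t W_t^T
    for W_t, X_t in zip(expert_weights, layer_inputs):
        g = X_t.T @ X_t
        gram += g
        rhs += g @ W_t.T
    # Small ridge term for numerical stability before the solve.
    merged_T = torch.linalg.solve(gram + eps * torch.eye(d_in, dtype=dtype), rhs)
    return merged_T.T

def layerwise_merge(expert_states, layer_inputs_by_name):
    """Merge every 2-D weight layer by layer; copy remaining parameters from the first expert."""
    merged = {}
    for name, ref in expert_states[0].items():
        if ref.ndim == 2 and name in layer_inputs_by_name:
            merged[name] = merge_layer([s[name] for s in expert_states],
                                       layer_inputs_by_name[name])
        else:
            merged[name] = ref.clone()
    return merged
```

In practice, the calibration activations for each task would be collected by running a handful of unlabeled examples through the corresponding expert up to the layer being merged; how LOT Merging defines and solves the drift objective at each layer is detailed in the full paper.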
