Model merging combines multiple expert models into a single multi-task model to reduce storage and computational resources.
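In its simplest and most common form, model merging averages the parameters of several fine-tuned models that share one architecture. The sketch below illustrates that baseline weight-averaging operation only (it is not the paper's Reparameterized Heavy-Tailed method); the parameter-dict representation and helper name are illustrative assumptions.

```python
def merge_models(state_dicts, weights=None):
    """Merge same-architecture models by (weighted) parameter averaging.

    state_dicts: list of {param_name: list_of_floats} mappings, one per model.
    weights: optional per-model mixing coefficients; defaults to uniform.
    """
    n = len(state_dicts)
    if weights is None:
        weights = [1.0 / n] * n  # uniform average over all experts
    merged = {}
    for name in state_dicts[0]:
        # Element-wise weighted sum of each parameter tensor across models.
        merged[name] = [
            sum(w * sd[name][i] for w, sd in zip(weights, state_dicts))
            for i in range(len(state_dicts[0][name]))
        ]
    return merged

# Two toy "expert" models with identical parameter names.
expert_a = {"layer.weight": [1.0, 2.0], "layer.bias": [0.0]}
expert_b = {"layer.weight": [3.0, 4.0], "layer.bias": [2.0]}
merged = merge_models([expert_a, expert_b])
# merged["layer.weight"] -> [2.0, 3.0]; merged["layer.bias"] -> [1.0]
```

Because every expert contributes to the same fixed set of parameters, adding more models forces them to share a finite capacity, which is the intuition behind the upper bound discussed next.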
Recent model merging methods struggle to maintain performance gains as the number of merged models grows.
Theoretical analysis suggests an upper bound on the performance gains achievable by model merging, owing to the limited effective parameter space of a single model.
The study introduces a Reparameterized Heavy-Tailed method to enhance the performance of merged models and validates the findings on various benchmarks.