A single global model in federated learning (FL) may not suffice to serve many clients with non-IID tasks and data distributions.
The paper proposes FedMerge, a novel approach that creates a personalized model for each client by merging multiple global models with client-specific, optimized merging weights.
FedMerge allows a few global models to serve many non-IID clients without requiring further local fine-tuning.
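To make the merging idea concrete, here is a minimal sketch of a per-client parameter-wise weighted merge. It assumes each global model is represented as a PyTorch state dict and that the client's merging weights are given (in the paper they are optimized; here they are placeholders); all function and variable names are illustrative, not from the paper's code.

```python
import torch

def merge_models(global_state_dicts, client_logits):
    """Build one personalized model for a client as a weighted
    combination of K global models (illustrative sketch)."""
    # Normalize the client's merging weights, e.g., via softmax.
    weights = torch.softmax(client_logits, dim=0)
    merged = {}
    for name in global_state_dicts[0]:
        # Parameter-wise weighted sum across the K global models.
        merged[name] = sum(
            w * sd[name] for w, sd in zip(weights, global_state_dicts)
        )
    return merged

# Toy usage: K = 3 tiny "global models" merged for one client.
K = 3
global_models = [
    {"linear.weight": torch.randn(2, 4), "linear.bias": torch.randn(2)}
    for _ in range(K)
]
client_logits = torch.zeros(K)  # uniform merge before any optimization
personalized = merge_models(global_models, client_logits)
```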
FedMerge outperforms existing FL methods across diverse non-IID settings.