Federated learning (FL) is a promising paradigm for distributed learning that preserves user privacy.
However, the increasing size of models makes it difficult for users with limited resources to participate in FL.
The authors propose GeFL, a model-agnostic federated learning approach that incorporates a generative model to aggregate global knowledge across users with heterogeneous models.
Experimental results show that GeFL improves performance over baselines, and the authors additionally propose GeFL-F, a framework that addresses privacy and scalability concerns.
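To make the idea concrete, below is a minimal, hypothetical sketch of the general mechanism described in the summary, not the authors' exact procedure: a shared generative model is aggregated across clients (FedAvg-style on its weights), and each client then trains its own, possibly different, target model on synthetic samples from that generator. All module names, sizes, and the training loop are illustrative assumptions.

```python
# Hypothetical sketch of generative-model-aided FL with heterogeneous client models.
# Assumptions: a homogeneous conditional generator shared by all clients, uniform
# FedAvg aggregation, and simple MLP target models of different widths.
import copy
import torch
import torch.nn as nn

LATENT_DIM, IMG_DIM, NUM_CLASSES = 16, 28 * 28, 10


class Generator(nn.Module):
    """Shared (homogeneous) conditional generator aggregated across clients."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + NUM_CLASSES, 128), nn.ReLU(),
            nn.Linear(128, IMG_DIM), nn.Tanh(),
        )

    def forward(self, z, y_onehot):
        return self.net(torch.cat([z, y_onehot], dim=1))


def make_client_model(width):
    """Heterogeneous target models: same task, different capacity per client."""
    return nn.Sequential(nn.Linear(IMG_DIM, width), nn.ReLU(), nn.Linear(width, NUM_CLASSES))


def fedavg(states):
    """Uniform FedAvg over a list of generator state_dicts."""
    avg = copy.deepcopy(states[0])
    for key in avg:
        avg[key] = torch.stack([s[key].float() for s in states]).mean(dim=0)
    return avg


# --- one illustrative communication round -----------------------------------
global_gen = Generator()
client_models = [make_client_model(w) for w in (64, 128, 256)]  # heterogeneous

local_gen_states = []
for _ in client_models:
    # Each client would update its local copy of the generator on private data;
    # here we only clone the global weights to keep the sketch self-contained.
    local_gen = copy.deepcopy(global_gen)
    local_gen_states.append(local_gen.state_dict())

global_gen.load_state_dict(fedavg(local_gen_states))  # aggregate global knowledge

# Each client trains its own model on synthetic samples from the shared generator,
# so knowledge is transferred without requiring identical model architectures.
for model in client_models:
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    z = torch.randn(32, LATENT_DIM)
    y = torch.randint(0, NUM_CLASSES, (32,))
    y_onehot = nn.functional.one_hot(y, NUM_CLASSES).float()
    with torch.no_grad():
        synth = global_gen(z, y_onehot)
    loss = nn.functional.cross_entropy(model(synth), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this reading, only the generator's weights need a common architecture across users, which is what allows the target models themselves to remain heterogeneous.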