Personalized federated learning is widely used to train models on devices with heterogeneous data, but existing approaches typically require labeled data on each client for personalization.
This paper proposes FLowDUP, a method that generates a personalized model for each client in a single forward pass, using only that client's unlabeled data.
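The following is a minimal sketch of what forward-pass personalization from unlabeled data could look like; it is not the paper's implementation, and the generator architecture, the use of a mean-feature summary, and all dimensions are illustrative assumptions.

```python
# Hypothetical sketch: a generator network maps summary statistics of a
# client's unlabeled data to personalized classifier weights, so
# personalization needs no labels and no local gradient steps.
import torch
import torch.nn as nn

FEAT_DIM, NUM_CLASSES = 64, 10

class ParamGenerator(nn.Module):
    """Maps an unlabeled-data summary to flattened classifier weights."""
    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        out_dim = num_classes * feat_dim + num_classes  # weight + bias
        self.net = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(), nn.Linear(128, out_dim)
        )

    def forward(self, unlabeled_feats: torch.Tensor):
        # Summarize the client's unlabeled data by its mean feature vector.
        summary = unlabeled_feats.mean(dim=0)
        flat = self.net(summary)
        weight = flat[: NUM_CLASSES * FEAT_DIM].view(NUM_CLASSES, FEAT_DIM)
        bias = flat[NUM_CLASSES * FEAT_DIM:]
        return weight, bias

generator = ParamGenerator(FEAT_DIM, NUM_CLASSES)
client_feats = torch.randn(32, FEAT_DIM)   # unlabeled client features
w, b = generator(client_feats)             # single forward pass
logits = client_feats @ w.T + b            # predictions from the personalized head
```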
The generated model parameters lie in a low-dimensional subspace, which keeps both communication and computation efficient.
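As a rough illustration of why a low-dimensional subspace helps, the sketch below parameterizes a model as a shared anchor plus a fixed basis times a small client-specific code; the exact construction in FLowDUP may differ, and the names and dimensions here are assumptions.

```python
# Assumed form: theta = theta_0 + P @ z, where only the small code z is
# client-specific, so clients exchange z instead of full-size parameters.
import numpy as np

FULL_DIM, SUBSPACE_DIM = 100_000, 16

rng = np.random.default_rng(0)
theta_0 = rng.normal(size=FULL_DIM)                                     # shared anchor parameters
P = rng.normal(size=(FULL_DIM, SUBSPACE_DIM)) / np.sqrt(SUBSPACE_DIM)  # fixed low-dimensional basis

z_client = rng.normal(size=SUBSPACE_DIM)        # low-dimensional client code
theta_client = theta_0 + P @ z_client           # personalized full parameter vector

# Communication per client: 16 floats instead of 100,000.
print(f"floats exchanged: {z_client.size} vs {theta_client.size}")
```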
Experiments on multiple datasets with unlabeled clients show strong empirical performance for FLowDUP, and theoretical results further support the approach.