Federated Neural Architecture Search (NAS) enables clients to collaboratively search for model architectures tailored to their heterogeneous data, thereby achieving higher accuracy.
FedMetaNAS is a framework that integrates meta-learning with NAS in the Federated Learning (FL) setting; it accelerates the architecture search by pruning the search space and eliminating the separate retraining stage.
It uses the Gumbel-Softmax reparameterization to relax the discrete choice among mixed operations into a differentiable one, and Model-Agnostic Meta-Learning (MAML) to adapt both weights and architecture parameters to individual tasks.
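A minimal PyTorch sketch of these two ingredients follows, under stated assumptions: a hypothetical `MixedOp` module relaxes an operation choice with `torch.nn.functional.gumbel_softmax`, and a first-order MAML-style step (`fomaml_step`) adapts a clone of the model on a client's support set before the meta-update. The candidate operations, all names, and the first-order shortcut are illustrative assumptions, not the paper's actual implementation.

```python
import copy

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical candidate operations for one edge of a search cell.
CANDIDATE_OPS = [
    lambda c: nn.Conv2d(c, c, 3, padding=1),
    lambda c: nn.MaxPool2d(3, stride=1, padding=1),
    lambda c: nn.Identity(),
]

class MixedOp(nn.Module):
    """Relaxes the discrete choice over candidate operations via Gumbel-Softmax."""

    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList(make(channels) for make in CANDIDATE_OPS)
        # Architecture parameters: logits over the candidate operations.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
        # Straight-through Gumbel-Softmax: the forward pass uses a one-hot
        # sample (effectively pruning to a single op), while gradients flow
        # through the soft relaxation of alpha.
        weights = F.gumbel_softmax(self.alpha, tau=tau, hard=True)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

def fomaml_step(model, loss_fn, support, query, inner_lr=0.01):
    """First-order MAML-style step (an assumed simplification): adapt a clone
    of the model (weights and architecture parameters alike) on a client's
    support set, then accumulate the query-loss gradients on the original
    model for the meta-update."""
    adapted = copy.deepcopy(model)
    x_s, y_s = support
    grads = torch.autograd.grad(loss_fn(adapted(x_s), y_s),
                                list(adapted.parameters()))
    with torch.no_grad():  # one inner SGD step on the support set
        for p, g in zip(adapted.parameters(), grads):
            p -= inner_lr * g
    x_q, y_q = query
    q_grads = torch.autograd.grad(loss_fn(adapted(x_q), y_q),
                                  list(adapted.parameters()))
    for p, g in zip(model.parameters(), q_grads):  # meta-gradient transfer
        p.grad = g.clone() if p.grad is None else p.grad + g
```

In a federated round, one could imagine each client calling `fomaml_step` on its local support/query split and the server aggregating the resulting gradients or parameters, as in standard FL averaging; this usage pattern is likewise an assumption for illustration.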
Experimental evaluations show that FedMetaNAS accelerates the search process by more than 50% while attaining higher accuracy than FedNAS.