A novel framework for heterogeneous federated learning (FL) is proposed to address client heterogeneity and improve model performance.
The framework captures local and global training processes through a bilevel formulation.
It accommodates personalized learning, server-side pre-training, nonstandard aggregation, nonidentical local update steps, and clients' local constraints.
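As an illustration only (the paper's exact formulation is not given here), a bilevel objective coupling a global server model with personalized client models can be sketched as follows; the symbols $w$, $\theta_i$, $f_i$, $\Theta_i$, and $\lambda$ are hypothetical placeholders, not taken from the paper:

```latex
% Hypothetical sketch of a bilevel heterogeneous-FL objective.
% Outer level: server aggregates over a global model w.
% Inner level: each client i fits a personalized model theta_i,
% regularized toward w, within its local constraint set Theta_i.
\begin{align*}
\min_{w} \quad & \frac{1}{n}\sum_{i=1}^{n} f_i\bigl(\theta_i^{\star}(w)\bigr) \\
\text{s.t.} \quad & \theta_i^{\star}(w) \in \arg\min_{\theta_i \in \Theta_i}
  \; f_i(\theta_i) + \frac{\lambda}{2}\,\lVert \theta_i - w \rVert^{2},
  \qquad i = 1,\dots,n,
\end{align*}
```

where $f_i$ is client $i$'s local loss and the constraint sets $\Theta_i$ encode clients' local constraints; the coupling term $\frac{\lambda}{2}\lVert\theta_i - w\rVert^2$ is one common way to model personalization against a shared global model.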
The proposed method, ZO-HFL, achieves both nonasymptotic and asymptotic convergence guarantees without relying on the standard assumptions commonly made in heterogeneous FL.