Asynchronous Federated Learning (AFL) enables multiple mobile devices to train a shared model independently, without waiting for synchronized aggregation rounds.
Device mobility causes intermittent connectivity: short contact windows force gradient sparsification to shrink each upload, and irregular update timing leads to model staleness.
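As context for the sparsification step, the sketch below shows top-k magnitude sparsification, a common scheme for this setting; the paper's exact sparsifier is not specified here, so treat `topk_sparsify` and its interface as illustrative assumptions.

```python
import numpy as np

def topk_sparsify(grad: np.ndarray, ratio: float):
    """Keep only the largest-magnitude `ratio` fraction of gradient entries.

    Returns the sparsified gradient and the kept indices, so a receiver
    can reconstruct the update from (values, indices) at reduced cost.
    """
    flat = grad.ravel()
    k = max(1, int(ratio * flat.size))
    # Indices of the k largest-magnitude entries
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(grad.shape), idx
```

With `ratio=0.01`, only 1% of the gradient entries are transmitted, which is what makes uploads feasible within a brief contact window.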
A theoretical model is developed to analyze how sparsification, model staleness, and device mobility jointly affect AFL convergence.
A mobility-aware dynamic sparsification (MADS) algorithm is proposed that adapts the degree of sparsification to the available contact time and the current model staleness, improving convergence and achieving better results in experiments.
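To make the trade-off concrete, here is a minimal sketch of how a dynamic sparsification ratio might be chosen from contact time and staleness; the function name, parameters, and the staleness discount rule are assumptions for illustration, not the closed-form rule derived in the paper.

```python
def choose_sparsification_ratio(contact_time_s: float,
                                uplink_rate_bps: float,
                                grad_size_bits: int,
                                staleness: int,
                                max_staleness: int = 10) -> float:
    """Pick the fraction of gradient entries to transmit in one encounter.

    The contact window caps how many bits fit into the encounter, and
    staleness discounts the value of a large update. Both rules here are
    illustrative placeholders for the optimization the paper performs.
    """
    # Largest ratio that fits in the contact window (index overhead ignored)
    budget_ratio = min(1.0, contact_time_s * uplink_rate_bps / grad_size_bits)
    # Assumed heuristic: staler updates get a smaller transmission budget
    staleness_discount = max(0.0, 1.0 - staleness / max_staleness)
    return max(0.01, budget_ratio * staleness_discount)
```

The intent mirrors the abstract's claim: the ratio is not fixed in advance but recomputed per encounter from mobility-driven contact time and the device's staleness.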