Mobile edge computing (MEC) networks face the challenge of efficiently handling the diverse machine learning tasks generated by mobile users. The traditional approach of offloading each task to the nearest available edge server can cause that server to overfit to recent tasks or forget earlier ones. To address this, mixture-of-experts (MoE) theory is introduced into MEC networks to improve continual learning (CL) performance: an adaptive gating network routes each incoming task to a specialized expert, reducing generalization error over time.
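The gating idea described above can be sketched as follows. This is a minimal, illustrative top-1 MoE router, assuming a linear gating layer with a softmax over experts; the dimensions, weight matrix, and function names are hypothetical and not taken from the source, which does not specify the gating architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: task feature size and number of specialized experts.
FEATURE_DIM, NUM_EXPERTS = 8, 4

# Gating network: a single linear layer followed by a softmax over experts.
W_gate = rng.normal(scale=0.1, size=(FEATURE_DIM, NUM_EXPERTS))

def gate(task_features):
    """Return a probability distribution over the experts for one task."""
    logits = task_features @ W_gate
    exp = np.exp(logits - logits.max())  # shift for numerical stability
    return exp / exp.sum()

def route(task_features):
    """Top-1 routing: send the task to the most probable expert."""
    probs = gate(task_features)
    return int(np.argmax(probs)), probs

# Route one synthetic task-feature vector to an expert.
task = rng.normal(size=FEATURE_DIM)
expert_id, probs = route(task)
```

In a full CL setting the gate would be trained so that tasks of the same type consistently reach the same expert, which is what limits interference between unrelated tasks.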