Federated Learning (FL) seeks model parameters that minimize a global objective defined as the weighted sum of each client's local objective, with weights proportional to the client's local data size.
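Under the standard setup of K clients, where client k holds n_k of the n total samples and P_k denotes its local dataset, this objective can be written as:

```latex
\min_{w} \; F(w) = \sum_{k=1}^{K} \frac{n_k}{n}\, F_k(w),
\qquad
F_k(w) = \frac{1}{n_k} \sum_{i \in \mathcal{P}_k} \ell(w; x_i, y_i),
```

where \ell is the per-example loss and F_k is the empirical risk on client k's data.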
Federated Averaging (FedAvg) is the canonical FL algorithm. It proceeds in rounds: the server sends the current global model to a sampled subset of clients, each client performs several steps of local training on its own data, and the server averages the clients' updated parameters, weighted by local data size, to form the new global model. Running multiple local steps per round reduces communication cost compared with exchanging a gradient after every step.
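The server-side aggregation step described above can be sketched as follows; this is a minimal illustration, not a reference implementation, and the function name and flat-list parameter representation are assumptions for clarity:

```python
def fedavg_aggregate(client_params, client_sizes):
    """Weighted average of client parameter vectors, weighted by local data size.

    client_params: list of parameter vectors (one flat list of floats per client)
    client_sizes:  list of local dataset sizes n_k, aligned with client_params
    """
    total = sum(client_sizes)  # n = sum of n_k over participating clients
    dim = len(client_params[0])
    global_params = [0.0] * dim
    for params, n_k in zip(client_params, client_sizes):
        weight = n_k / total  # client's share of the total data
        for i, p in enumerate(params):
            global_params[i] += weight * p
    return global_params

# Example: two clients holding 100 and 300 samples (weights 0.25 and 0.75).
updated = fedavg_aggregate([[1.0, 2.0], [3.0, 6.0]], [100, 300])
# → [2.5, 5.0]
```

In a full round, the server would broadcast `global_params` back to the next sampled subset of clients before repeating the process.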
Pairing FL with techniques such as Differential Privacy (DP), which bounds what any single update can reveal about an individual record, and secure computation, which keeps individual client updates hidden from the server, strengthens its privacy guarantees and its prospects as a mainstream machine learning tool.