Federated Learning (FL) is a framework for distributed machine learning that preserves privacy and enhances security.
Data heterogeneity poses a key challenge for federated learning, yet most existing methods overlook the adjustment of aggregation weights.
The proposed method, Federated Learning with Adaptive Weight Aggregation (FedAWA), adaptively adjusts aggregation weights based on client vectors during the learning process.
FedAWA assigns higher aggregation weights to clients whose local updates align with the global optimization direction, improving the stability and generalization of the global model.
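To make the idea concrete, the following is a minimal sketch of alignment-based aggregation weighting, not the paper's exact formulation: it assumes flattened client parameter updates, uses the mean update as a proxy for the global optimization direction, scores each client by cosine similarity to that direction, and turns the scores into weights with a softmax. The function name `aggregate` and the `temperature` parameter are illustrative choices.

```python
import numpy as np

def aggregate(client_updates, temperature=1.0):
    """Weight client updates by their alignment with a proxy global direction.

    client_updates: list of 1-D numpy arrays (flattened parameter deltas).
    The mean-update proxy and softmax normalization are assumptions made
    for illustration only.
    """
    updates = np.stack(client_updates)      # shape: (num_clients, dim)
    global_dir = updates.mean(axis=0)       # proxy for the global optimization direction

    # Cosine similarity between each client update and the global direction.
    eps = 1e-12
    sims = updates @ global_dir / (
        np.linalg.norm(updates, axis=1) * np.linalg.norm(global_dir) + eps
    )

    # Higher alignment -> higher aggregation weight.
    weights = np.exp(sims / temperature)
    weights /= weights.sum()

    # Weighted aggregation of client updates into a single global update.
    return weights, weights @ updates

# Example usage with random updates standing in for real client deltas.
rng = np.random.default_rng(0)
client_updates = [rng.normal(size=10) for _ in range(5)]
weights, global_update = aggregate(client_updates)
```

Clients whose updates point away from the shared direction receive smaller weights, which damps the influence of highly heterogeneous or conflicting local updates on the aggregated global model.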