In decentralized learning, a group of agents collaborates to learn a global model from their distributed local datasets without a central server.
The heterogeneity of the local data across agents makes learning a robust global model challenging.
PDSL is a privacy-preserved decentralized stochastic learning algorithm that addresses these challenges: each agent leverages Shapley values to measure the contributions of its neighbors' models, and applies differential privacy to prevent privacy leakage.
Both theoretical analysis and extensive experiments demonstrate the efficacy of PDSL in terms of privacy preservation and convergence.
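To make the two ingredients concrete, the following minimal sketch shows one agent's update combining Monte Carlo Shapley estimation of neighbor contributions with a differentially private local step. This is an illustrative toy, not the authors' implementation: it assumes a linear-regression model, scores a coalition of neighbors by the reduction in local validation loss when their models are averaged with the agent's own, and enforces privacy with a DP-SGD-style clipped, Gaussian-noised gradient. All function names and the utility definition are assumptions for illustration.

```python
# Sketch of one PDSL-style update (illustrative assumptions, not the paper's exact method).
import numpy as np

rng = np.random.default_rng(0)

def local_loss(w, X, y):
    """Mean squared error of model w on the agent's local validation data."""
    return float(np.mean((X @ w - y) ** 2))

def shapley_weights(w_self, neighbor_models, X_val, y_val, n_samples=200):
    """Monte Carlo (permutation-sampling) Shapley estimate of each neighbor's
    marginal contribution. The coalition 'utility' here is the drop in local
    validation loss when the coalition's models are averaged with the agent's
    own model -- an illustrative choice, not necessarily the paper's."""
    n = len(neighbor_models)
    contrib = np.zeros(n)
    base = local_loss(w_self, X_val, y_val)

    def utility(coalition):
        if not coalition:
            return 0.0
        avg = np.mean([neighbor_models[i] for i in coalition] + [w_self], axis=0)
        return base - local_loss(avg, X_val, y_val)  # loss reduction = utility

    for _ in range(n_samples):
        perm = rng.permutation(n)
        coalition, prev = [], 0.0
        for i in perm:
            coalition.append(i)
            v = utility(coalition)
            contrib[i] += v - prev  # marginal contribution of neighbor i
            prev = v
    contrib = np.maximum(contrib / n_samples, 0.0)  # keep only helpful neighbors
    total = contrib.sum()
    return contrib / total if total > 0 else np.full(n, 1.0 / n)

def dp_sgd_step(w, X, y, lr=0.1, clip=1.0, sigma=1.0):
    """One differentially private step: per-example gradients are clipped to
    norm `clip`, averaged, and perturbed with Gaussian noise (Gaussian mechanism)."""
    grads = 2 * (X @ w - y)[:, None] * X  # per-example MSE gradients
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    grads = grads / np.maximum(1.0, norms / clip)
    noisy = grads.mean(axis=0) + rng.normal(0, sigma * clip / len(X), size=w.shape)
    return w - lr * noisy

# Toy run: one agent weights two neighbors by Shapley value, mixes their
# models with its own, then takes a private local step on its own data.
d = 5
X_val, y_val = rng.normal(size=(50, d)), rng.normal(size=50)
X_loc, y_loc = rng.normal(size=(100, d)), rng.normal(size=100)
w_self = rng.normal(size=d)
neighbors = [rng.normal(size=d) for _ in range(2)]

alpha = shapley_weights(w_self, neighbors, X_val, y_val)
w_mixed = 0.5 * w_self + 0.5 * sum(a * w for a, w in zip(alpha, neighbors))
w_next = dp_sgd_step(w_mixed, X_loc, y_loc)
```

The design point the sketch illustrates is the division of labor: the Shapley weights downweight neighbors whose heterogeneous models would hurt the local objective, while the clipping-plus-noise step bounds each example's influence so the shared model leaks less about any individual record.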