Multi-Modal Learning (MML) aims to integrate information from diverse modalities for better predictive accuracy.
Existing methods aggregate gradients with fixed weights, overlooking the gradient uncertainty of each modality.
BOGC-MML, a Bayesian-Oriented Gradient Calibration method for MML, is introduced to quantify these gradient uncertainties and calibrate the model's update direction accordingly.
The method models each modality's gradient as a random variable, quantifies uncertainties, and balances sensitivity and conservatism across dimensions.
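To make this concrete, the sketch below shows one common way such an idea can be realized: inverse-variance (precision-weighted) fusion of per-modality gradients, where each modality's gradient is treated as a Gaussian random variable per dimension. This is an illustrative assumption, not the paper's exact algorithm; the function name `fuse_gradients` and the fusion rule are hypothetical.

```python
def fuse_gradients(grads, variances, eps=1e-8):
    """Hypothetical precision-weighted fusion of per-modality gradients.

    Assumption (not necessarily BOGC-MML's exact rule): each modality m
    provides a gradient estimate g_m with per-dimension variance v_m.
    Treating each as a Gaussian random variable, the Bayesian fusion is
        g[d] = (sum_m g_m[d] / v_m[d]) / (sum_m 1 / v_m[d]).
    Dimensions where a modality is uncertain (large variance) are
    down-weighted (conservatism); confident dimensions dominate the
    update (sensitivity).
    """
    dims = len(grads[0])
    fused = []
    for d in range(dims):
        num = sum(g[d] / (v[d] + eps) for g, v in zip(grads, variances))
        den = sum(1.0 / (v[d] + eps) for v in variances)
        fused.append(num / den)
    return fused

# Two modalities, two parameter dimensions:
g = [[1.0, 4.0],   # modality A's gradient estimate
     [3.0, 0.0]]   # modality B's gradient estimate
v = [[0.1, 10.0],  # A is confident in dim 0, uncertain in dim 1
     [10.0, 0.1]]  # B is the reverse
fused = fuse_gradients(g, v)
# fused[0] stays close to A's value (1.0); fused[1] close to B's (0.0)
```

In this example each fused dimension is dominated by whichever modality is most certain there, illustrating how uncertainty-aware weighting differs from the fixed-weight aggregation of prior methods.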