Output uncertainty indicates whether the probabilistic properties of a prediction reflect objective characteristics of the model output.
A post-processing parametric calibration method called $\rho$-Norm Scaling is introduced to mitigate overconfidence on limited data sets.
The method expands the expression of the calibrator, preserving classification accuracy while damping the excessive amplitude of the predicted confidences.
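As an illustrative instantiation (an assumption made here for concreteness, not necessarily the exact form of the method), such an expanded calibrator can be viewed as temperature scaling with a per-instance temperature driven by the $\rho$-norm of the logit vector $z$:
$$\hat{p} = \operatorname{softmax}\!\left(\frac{z}{a\,\lVert z\rVert_\rho + b}\right), \qquad \lVert z\rVert_\rho = \Bigl(\sum_{j} |z_j|^\rho\Bigr)^{1/\rho}, \quad a, b > 0,$$
where $a$ and $b$ are fitted on a held-out calibration set. Because all logits of an instance are rescaled by the same positive factor, the predicted class, and hence accuracy, is unchanged, while large logit magnitudes are shrunk, reducing overconfidence.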
A probability-distribution regularization term is included to ensure that the instance-level uncertainty distribution after calibration remains close to the distribution before calibration.
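The sketch below shows how such a post-hoc calibrator could be fitted, assuming the per-instance temperature form above and a KL-divergence regularizer toward the pre-calibration distribution; the class name `RhoNormScaling`, the parameters `a` and `b`, and the loss weighting are illustrative assumptions rather than the method's exact specification.

# Sketch of a post-hoc calibrator in the spirit of rho-Norm Scaling.
# The functional form (per-instance temperature a*||z||_rho + b) and the
# KL-based distribution regularizer are illustrative assumptions, not the
# method's exact formulation.
import torch
import torch.nn.functional as F


class RhoNormScaling(torch.nn.Module):
    def __init__(self, rho: float = 2.0):
        super().__init__()
        self.rho = rho
        self.a = torch.nn.Parameter(torch.ones(1))   # scale on the logit norm
        self.b = torch.nn.Parameter(torch.zeros(1))  # offset; softplus keeps temperature positive

    def forward(self, logits: torch.Tensor) -> torch.Tensor:
        # Per-instance temperature driven by the rho-norm of the logit vector.
        norm = logits.abs().pow(self.rho).sum(dim=-1, keepdim=True).pow(1.0 / self.rho)
        temp = F.softplus(self.a * norm + self.b) + 1e-6
        return logits / temp  # monotone rescaling: argmax (accuracy) is unchanged


def fit_calibrator(model_logits, labels, rho=2.0, reg_weight=0.1, steps=200):
    """Fit on a held-out calibration split: NLL plus a KL term that keeps each
    instance's calibrated distribution close to the uncalibrated one."""
    calib = RhoNormScaling(rho)
    opt = torch.optim.Adam(calib.parameters(), lr=1e-2)
    ref_log_probs = F.log_softmax(model_logits, dim=-1).detach()
    for _ in range(steps):
        opt.zero_grad()
        calibrated = calib(model_logits)
        nll = F.cross_entropy(calibrated, labels)
        kl = F.kl_div(F.log_softmax(calibrated, dim=-1), ref_log_probs,
                      log_target=True, reduction="batchmean")
        loss = nll + reg_weight * kl
        loss.backward()
        opt.step()
    return calib

In this reading, the cross-entropy term drives calibration on the held-out split, while the KL term plays the role of the probability-distribution regularizer, discouraging instance-level distributions from drifting far from their pre-calibration shape.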