The research article focuses on Bayesian ensembles and learning optimal combinations of Bayesian models in an online, continual learning setting.
The study reinterprets approaches such as Bayesian model averaging (BMA) and Bayesian stacking through an empirical Bayes lens, highlighting the limitations of BMA.
A new method called Online Bayesian Stacking (OBS) is introduced, which adaptively combines Bayesian models by optimizing the log-score of the combined predictive distribution, and which outperforms online BMA in certain scenarios.
The work establishes a connection between OBS and portfolio selection, which yields efficient algorithms and a regret analysis, and it provides guidance on when to prefer OBS over online BMA.
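To illustrate the portfolio-selection connection, here is a minimal sketch of one plausible online-stacking update: an exponentiated-gradient step on the log-score of a weighted mixture of model predictive densities. This is a standard algorithm from the online portfolio literature, not necessarily the exact method of the article; the step size `eta` and the toy densities are hypothetical choices for illustration.

```python
import numpy as np

def eg_stacking_update(weights, densities, eta=0.1):
    """One exponentiated-gradient step maximizing the log-score
    log(sum_k w_k * p_k(y_t)) over the probability simplex.

    weights   : current mixture weights over K models (sums to 1)
    densities : p_k(y_t), each model's predictive density at the new observation
    eta       : step size (hypothetical value, would be tuned in practice)
    """
    mixture = float(weights @ densities)          # mixture density at y_t
    grad = densities / mixture                    # gradient of log mixture w.r.t. weights
    new_w = weights * np.exp(eta * grad)          # multiplicative (EG) update
    return new_w / new_w.sum()                    # project back to the simplex

# Toy stream where model 0 consistently assigns higher density to the data
# (fixed densities are a hypothetical stand-in for real predictive evaluations).
w = np.array([0.5, 0.5])
for _ in range(500):
    dens = np.array([0.8, 0.2])
    w = eg_stacking_update(w, dens)
# The weight on the better-scoring model approaches 1 over the stream.
```

The same update viewed through portfolio selection treats each model as an asset and each predictive density as a per-round return, which is what makes portfolio algorithms and their regret bounds applicable.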