Retraining a model on a combination of its own predictions and the original labels can improve performance. This paper focuses on optimizing that combination of model predictions and labels for binary classification tasks. A framework based on approximate message passing (AMP) is developed for analyzing such retraining procedures. Within this framework, the study derives the Bayes-optimal aggregator function for retraining, showing its superiority in minimizing prediction error.
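The retraining loop described above can be sketched as follows. This is a minimal illustration, not the paper's method: it uses a plain logistic model trained by gradient descent, and a simple convex combination of labels and predictions as a hypothetical stand-in for the Bayes-optimal aggregator (whose exact form comes from the AMP analysis). The mixing weight `alpha` and all hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary classification data from a logistic model.
n, d = 500, 20
w_true = rng.normal(size=d) / np.sqrt(d)
X = rng.normal(size=(n, d))
y = (rng.random(n) < 1 / (1 + np.exp(-X @ w_true))).astype(float)


def sigmoid(z):
    return 1 / (1 + np.exp(-z))


def fit_logistic(X, targets, lr=0.1, steps=500, reg=1e-2):
    """Gradient descent on ridge-regularized logistic loss; accepts soft targets."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (sigmoid(X @ w) - targets) / len(targets) + reg * w
        w -= lr * grad
    return w


# First pass: train on the original labels, then collect soft predictions.
w0 = fit_logistic(X, y)
preds = sigmoid(X @ w0)

# Hypothetical aggregator: a convex combination of labels and predictions.
# The paper's Bayes-optimal aggregator would replace this simple rule.
alpha = 0.5
targets = alpha * y + (1 - alpha) * preds

# Retraining pass on the aggregated targets.
w1 = fit_logistic(X, targets)
```

In this sketch the aggregator is applied pointwise to each example's (label, prediction) pair; the retrained weights `w1` then fit the softened targets rather than the raw labels.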