Biological brains learn continually from unlabeled data and integrate specialized information from sparsely labeled examples without compromising generalization.
In contrast, machine learning methods suffer catastrophic forgetting in such natural learning settings: supervised fine-tuning on a specialist task degrades performance on the original tasks.
Task-modulated contrastive learning (TMCL) is introduced; inspired by the biophysical machinery of the neocortex, it uses predictive coding principles to integrate top-down information continually and without supervision.
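To make the idea concrete, below is a minimal sketch of task-modulated contrastive learning under stated assumptions: an encoder whose hidden activations are gain-modulated by a learned per-task vector (standing in for top-down modulation), trained without labels via a standard InfoNCE contrastive loss on two augmented views. The names `TaskModulatedEncoder`, `task_gain`, and `info_nce`, and the multiplicative-gain design, are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TaskModulatedEncoder(nn.Module):
    """Encoder whose hidden activations are gain-modulated by a
    learned per-task vector (a hypothetical stand-in for top-down
    modulation; the paper's mechanism may differ)."""
    def __init__(self, in_dim=784, hid_dim=256, out_dim=128, n_tasks=5):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        # One multiplicative modulation vector per task, initialized to 1
        # (i.e., no modulation before learning).
        self.task_gain = nn.Embedding(n_tasks, hid_dim)
        nn.init.ones_(self.task_gain.weight)
        self.head = nn.Linear(hid_dim, out_dim)

    def forward(self, x, task_id):
        h = self.backbone(x)
        g = self.task_gain(task_id)   # top-down gain, shape (B, hid_dim)
        return self.head(h * g)       # task-modulated features

def info_nce(z1, z2, temperature=0.1):
    """Standard InfoNCE contrastive loss between two views (no labels)."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature   # (B, B) cosine-similarity matrix
    targets = torch.arange(z1.size(0))   # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

# Toy usage: two "views" are noisy copies of the same unlabeled batch.
encoder = TaskModulatedEncoder()
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
x = torch.randn(32, 784)
task_id = torch.zeros(32, dtype=torch.long)  # index of the current task
z1 = encoder(x + 0.1 * torch.randn_like(x), task_id)
z2 = encoder(x + 0.1 * torch.randn_like(x), task_id)
loss = info_nce(z1, z2)
loss.backward()
opt.step()
```

In this sketch, continual learning across tasks would only update the small per-task gain vectors on new tasks, leaving the shared backbone comparatively stable; this is one plausible reading of how top-down modulation could balance stability and plasticity.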
Experiments demonstrate improvements in class-incremental and transfer learning over state-of-the-art unsupervised methods, suggesting the importance of top-down modulations in balancing stability and plasticity.