A new algorithm called Forward-Forward (FF) is introduced as an alternative to backpropagation (BP) for training neural networks.
The algorithm uses a novel goodness function, dimensionality compression, which incorporates the second-order statistical structure of neural activity: it minimizes the effective dimensionality (ED) of the responses to each clamped input while maximizing ED across the sample distribution.
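To make the goodness concrete, below is a minimal NumPy sketch assuming the common participation-ratio definition of effective dimensionality, ED = (sum_i lambda_i)^2 / sum_i lambda_i^2, where lambda_i are the eigenvalues of the activation covariance; the paper's exact ED estimator and loss composition may differ, and `dimensionality_compression_loss` is an illustrative name, not the authors' API.

```python
import numpy as np

def effective_dimensionality(acts: np.ndarray) -> float:
    """Participation ratio of the activation covariance:
    ED = (sum_i lambda_i)^2 / sum_i lambda_i^2, where lambda_i are the
    eigenvalues of the covariance of `acts` (shape: samples x units)."""
    centered = acts - acts.mean(axis=0, keepdims=True)
    cov = centered.T @ centered / max(acts.shape[0] - 1, 1)
    eigvals = np.clip(np.linalg.eigvalsh(cov), 0.0, None)  # guard tiny negatives
    return float(eigvals.sum() ** 2 / (np.square(eigvals).sum() + 1e-12))

def dimensionality_compression_loss(acts_clamped: np.ndarray,
                                    acts_distribution: np.ndarray) -> float:
    """Hypothetical per-layer objective: compress the ED of (e.g. noisy)
    responses to a single clamped input while expanding the ED of
    responses across the sample distribution (minimized during training)."""
    return (effective_dimensionality(acts_clamped)
            - effective_dimensionality(acts_distribution))

# Illustrative usage with random stand-ins for layer activations.
rng = np.random.default_rng(0)
acts_clamped = rng.normal(size=(64, 128))        # 64 noisy trials of one input
acts_distribution = rng.normal(size=(256, 128))  # 256 samples from the data
print(dimensionality_compression_loss(acts_clamped, acts_distribution))
```

Under this reading, minimizing the loss drives each clamped input toward a low-dimensional, compact response while keeping the layer's representation of the overall data distribution high-dimensional.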
The proposed algorithm achieves performance competitive with other non-BP methods and demonstrates that noise can enhance generalization and improve inference in neural networks.
The findings contribute to the development of more biologically plausible learning algorithms and suggest a potential application in neuromorphic computing, where stochasticity is used as a computational resource.