Spiking neural networks (SNNs) rely on the precise timing of spikes to represent and process information.
A new method called DelGrad has been proposed to compute exact loss gradients with respect to both synaptic weights and transmission delays in SNNs.
DelGrad eliminates the need to track additional state variables, and offers higher precision, greater efficiency, and better suitability for neuromorphic hardware.
Experiments on a neuromorphic platform demonstrate the memory efficiency and accuracy benefits of training with DelGrad, as well as the potential of trainable delays to stabilize networks against noise.
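The idea of exact, analytic gradients for both weights and delays can be sketched with a deliberately simplified toy model. The neuron model below is an assumption for illustration only (it is not DelGrad's actual spiking dynamics): the output "spike time" is taken to be the weight-weighted average of delayed input spike times, so the loss gradients with respect to weights and delays follow in closed form, with no auxiliary variables to track.

```python
import numpy as np

# Toy stand-in for a spike-time model (NOT the actual DelGrad neuron):
# t_out = sum_i w_i * (t_i + d_i) / sum_i w_i
# Because t_out is a closed-form function of weights w and delays d,
# the gradients of the loss are exact and analytic.

def forward(w, d, t_in):
    """Output spike time for weights w, delays d, input spike times t_in."""
    return np.sum(w * (t_in + d)) / np.sum(w)

def gradients(w, d, t_in, t_target):
    """Exact gradients of L = (t_out - t_target)^2 w.r.t. w and d."""
    S = np.sum(w)
    t_out = forward(w, d, t_in)
    dL_dt = 2.0 * (t_out - t_target)
    dt_dw = ((t_in + d) - t_out) / S   # d t_out / d w_i
    dt_dd = w / S                      # d t_out / d d_i
    return dL_dt * dt_dw, dL_dt * dt_dd

rng = np.random.default_rng(0)
t_in = rng.uniform(0.0, 1.0, size=4)   # input spike times
w = rng.uniform(0.5, 1.5, size=4)      # synaptic weights
d = rng.uniform(0.0, 0.5, size=4)      # synaptic delays
t_target = 0.9                         # desired output spike time

initial_err = abs(forward(w, d, t_in) - t_target)
for _ in range(200):                   # joint gradient descent on w and d
    gw, gd = gradients(w, d, t_in, t_target)
    w -= 0.1 * gw
    d = np.clip(d - 0.1 * gd, 0.0, None)  # keep delays non-negative
final_err = abs(forward(w, d, t_in) - t_target)
```

Training both parameter sets jointly, as above, is the key point: the delays are first-class trainable parameters with their own exact gradients, rather than quantities approximated through surrogate state.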