Conventional artificial deep neural networks operating near the phase boundary of their signal propagation dynamics exhibit universal scaling laws known from non-equilibrium statistical mechanics.
Multilayer perceptrons and convolutional neural networks belong to the mean-field and directed percolation universality classes, respectively.
Finite-size scaling suggests a potential connection to the depth-width trade-off in deep learning.
Hyperparameter tuning to the phase boundary is necessary but insufficient for achieving optimal generalization in deep networks, underscoring the importance of nonuniversal metric factors.
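As a concrete illustration of the signal-propagation phase boundary referenced above, the following minimal sketch (not taken from the paper; widths, depths, and variances are illustrative assumptions) propagates a random input through a deep random tanh network and sweeps the weight standard deviation sigma_w around the standard mean-field critical point sigma_w = 1 (with zero bias), where activations neither decay rapidly to zero nor saturate at a finite fixed point.

```python
# Minimal sketch, assuming a random tanh MLP with i.i.d. Gaussian weights of
# variance sigma_w**2 / width and zero bias. Illustrative only; not the
# authors' code or model sizes.
import numpy as np

def propagate(sigma_w, width=1000, depth=50, sigma_b=0.0, seed=0):
    """Push a random input through a deep random MLP and return the
    root-mean-square activation at each layer."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(width)
    rms = []
    for _ in range(depth):
        W = rng.standard_normal((width, width)) * sigma_w / np.sqrt(width)
        b = rng.standard_normal(width) * sigma_b
        x = np.tanh(W @ x + b)
        rms.append(np.sqrt(np.mean(x**2)))
    return np.array(rms)

# Sweep below, at, and above the tanh critical point sigma_w = 1.
for sigma_w in (0.8, 1.0, 1.2):
    rms = propagate(sigma_w)
    print(f"sigma_w = {sigma_w:.1f}: RMS activation at final layer = {rms[-1]:.3f}")

# Ordered phase (sigma_w < 1): activations decay quickly toward zero with depth.
# Near the phase boundary (sigma_w ~ 1): the decay slows dramatically.
# Chaotic phase (sigma_w > 1): activations settle at a nonzero fixed point.
```

In this hedged picture, choosing sigma_w on the boundary corresponds to the "hyperparameter tuning to the phase boundary" above; the universal statements in the abstract concern the scaling behavior near that boundary, while prefactors such as the overall decay scale are the kind of nonuniversal metric factors the last sentence refers to.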