Graph learning, a key problem in graph signal processing, is essential for applying the graph Fourier transform to non-Euclidean domains whose structure is unknown.
Product graphs have attracted growing interest for modeling dependencies across different dimensions of multi-way data, yet existing methods can represent only a limited range of dependency structures.
This paper studies learning a Kronecker-structured product graph from smooth signals; the Kronecker product captures more intricate dependencies than the Cartesian product but leads to a more challenging optimization problem.
An alternating optimization scheme is proposed to solve the resulting non-convex graph learning problem; its convergence is established theoretically, and experiments demonstrate its superior performance.
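To make the setting concrete, the sketch below illustrates one plausible instantiation of alternating optimization for a Kronecker-structured adjacency kron(A1, A2) learned from smooth signals. It is not the paper's algorithm: the smoothness term (the classical quadratic form written as a weighted sum of pairwise distances), the log-degree and Frobenius regularizers, the projected-gradient updates, and all hyperparameters (`alpha`, `beta`, `step`, `n_iter`) are illustrative assumptions introduced here.

```python
# A minimal sketch (assumed objective and updates, not the paper's method):
# alternate projected gradient steps on the two factors A1, A2 of a
# Kronecker-structured adjacency A = kron(A1, A2), fitted to signals X
# that are smooth on the product graph.
import numpy as np

def pairwise_sq_dists(X):
    """Z[i, j] = ||x_i - x_j||^2 for rows x_i of X."""
    sq = np.sum(X**2, axis=1)
    return sq[:, None] + sq[None, :] - 2.0 * X @ X.T

def project(A):
    """Project onto symmetric, nonnegative matrices with zero diagonal."""
    A = 0.5 * (A + A.T)
    A = np.maximum(A, 0.0)
    np.fill_diagonal(A, 0.0)
    return A

def factor_gradient(M, A, alpha, beta):
    """Gradient of 0.5*<A, M> - alpha*sum(log(degrees)) + beta*||A||_F^2."""
    deg = A.sum(axis=1) + 1e-10
    # Symmetrized per-entry gradient of the log-degree barrier,
    # which discourages the trivial all-zero solution.
    log_grad = -0.5 * alpha * (1.0 / deg[:, None] + 1.0 / deg[None, :])
    return 0.5 * M + log_grad + 2.0 * beta * A

def learn_kronecker_factors(X, n1, n2, alpha=1.0, beta=0.5,
                            step=1e-2, n_iter=200):
    """X: (n1*n2, m) matrix of m signals on the product graph, with node
    index i = p*n2 + q matching the ordering of kron(A1, A2)."""
    Z = pairwise_sq_dists(X).reshape(n1, n2, n1, n2)  # Z[p, q, r, s]
    rng = np.random.default_rng(0)
    A1 = project(rng.random((n1, n1)))
    A2 = project(rng.random((n2, n2)))
    for _ in range(n_iter):
        # Fix A2, update A1: the smoothness term <kron(A1, A2), Z> is
        # linear in A1 with coefficient matrix M1.
        M1 = np.einsum('pqrs,qs->pr', Z, A2)
        A1 = project(A1 - step * factor_gradient(M1, A1, alpha, beta))
        # Fix A1, update A2 symmetrically.
        M2 = np.einsum('pqrs,pr->qs', Z, A1)
        A2 = project(A2 - step * factor_gradient(M2, A2, alpha, beta))
    return A1, A2
```

The alternation exploits the fact that, with one factor fixed, the smoothness term is linear in the other factor, so each subproblem is a small convex program over that factor alone; the overall problem in both factors jointly remains non-convex, which is why convergence of such schemes requires separate analysis.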