Message-Passing Neural Networks (MPNNs) are widely used for processing graph-structured data, but they often suffer from over-squashing: information from many distant nodes is compressed into fixed-size node representations, so long-range dependencies are attenuated.
Researchers have identified in MPNNs a problem analogous to the limited Effective Receptive Field (ERF) of Convolutional Neural Networks: the set of nodes that actually influences a node's representation is much smaller than the theoretical receptive field, so the architecture's theoretical reach goes underutilized.
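The summary does not describe how this effect is measured; a common probe for ERFs, borrowed from the CNN literature, is gradient-based sensitivity. The sketch below illustrates it in plain PyTorch with a generic GCN-style layer standing in for the MPNN; the path-graph setup and all names are illustrative assumptions, not the paper's protocol.

```python
import torch
import torch.nn as nn

# Path graph of 9 nodes; node 4 is the centre, so hop distance from it is |u - 4|.
n, dim, depth = 9, 8, 4
adj = torch.zeros(n, n)
idx = torch.arange(n - 1)
adj[idx, idx + 1] = 1.0
adj[idx + 1, idx] = 1.0

# Symmetric GCN normalisation with self-loops: D^{-1/2}(A + I)D^{-1/2}.
a = adj + torch.eye(n)
d = a.sum(-1).rsqrt()
a_norm = d[:, None] * a * d[None, :]

# A depth-4 MPNN stand-in: node 4's theoretical receptive field is 4 hops.
x = torch.randn(n, dim, requires_grad=True)
h = x
for lin in [nn.Linear(dim, dim) for _ in range(depth)]:
    h = torch.relu(lin(a_norm @ h))

# Effective receptive field: gradient norm of the centre node's output
# with respect to each input node's features.
h[4].sum().backward()
for u, s in enumerate(x.grad.norm(dim=-1).tolist()):
    print(f"node {u} (hop {abs(u - 4)}): sensitivity {s:.4f}")
```

Nodes beyond 4 hops receive exactly zero sensitivity, and within range the values typically decay sharply with distance; that gap between theoretical and effective reach is the ERF problem described above.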
A new architecture, the Interleaved Multiscale Message-Passing Neural Network (IM-MPNN), is proposed to address this limitation: it interleaves message-passing across multiscale representations of the graph, improving the capture of long-range interactions.
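The summary leaves the architecture's internals unspecified, so the following PyTorch sketch shows one plausible reading of "message-passing across multiscale representations": a fixed coarsening hierarchy, a GCN-style update per scale, and a feature exchange between adjacent scales after every round. Every name here (GCNLayer, InterleavedMultiscaleMPNN, the pooling matrices p1/p2) is a hypothetical illustration, not the paper's implementation.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One GCN-style step on a dense adjacency (stand-in for the base MPNN layer)."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x, adj):
        # Symmetric normalisation with self-loops: D^{-1/2}(A + I)D^{-1/2} X W.
        a = adj + torch.eye(adj.size(0), device=adj.device)
        d = a.sum(-1).rsqrt()
        return torch.relu(self.lin((d[:, None] * a * d[None, :]) @ x))

class InterleavedMultiscaleMPNN(nn.Module):
    """Sketch of the interleaving idea: at every layer, each scale runs one
    message-passing step, then features are exchanged between adjacent scales."""
    def __init__(self, dim, num_scales=3, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.ModuleList([GCNLayer(dim) for _ in range(num_scales)])
             for _ in range(num_layers)]
        )

    def forward(self, x, adj, pools):
        # pools[s] is a (n_{s+1} x n_s) cluster-assignment matrix from scale s
        # to the coarser scale s+1 (assumed given; the paper's coarsening may differ).
        xs, adjs = [x], [adj]
        for p in pools:
            xs.append(p @ xs[-1])
            adjs.append(((p @ adjs[-1] @ p.t()) > 0).float())
        for scale_layers in self.layers:
            # 1) One message-passing step independently at every scale.
            xs = [mp(h, a) for mp, h, a in zip(scale_layers, xs, adjs)]
            # 2) Interleave: pass features fine -> coarse and coarse -> fine.
            mixed = list(xs)
            for s, p in enumerate(pools):
                mixed[s + 1] = mixed[s + 1] + p @ xs[s]
                mixed[s] = mixed[s] + p.t() @ xs[s + 1]
            xs = mixed
        return xs[0]  # finest-scale node representations

# Demo on a random 10-node graph with a hypothetical 10 -> 5 -> 2 coarsening.
n, dim = 10, 16
adj = ((torch.rand(n, n) > 0.7).float())
adj = ((adj + adj.t()) > 0).float()
adj.fill_diagonal_(0)
x = torch.randn(n, dim)
p1 = torch.zeros(5, n); p1[torch.arange(n) // 2, torch.arange(n)] = 1.0
p2 = torch.zeros(2, 5); p2[torch.arange(5) // 3, torch.arange(5)] = 1.0
out = InterleavedMultiscaleMPNN(dim)(x, adj, [p1, p2])
print(out.shape)  # torch.Size([10, 16])
```

The intuition behind interleaving is that a single edge at a coarse scale spans many fine-graph hops, so information can travel long distances in few steps before being projected back down; that is one way a multiscale design can ease over-squashing.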
Extensive evaluations, including on the Long-Range Graph Benchmark (LRGB), show significant improvements over baseline MPNNs in capturing long-range dependencies while maintaining comparable computational efficiency.