Cluster-wise Graph Transformer with Dual-granularity Kernelized Attention

The paper introduces the Node-to-Cluster Attention (N2C-Attn) mechanism for graph learning. N2C-Attn incorporates techniques from Multiple Kernel Learning to capture information at both the node and cluster levels. The resulting architecture, the Cluster-wise Graph Transformer (Cluster-GT), outperforms other methods on graph-level tasks.
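
To make the dual-granularity idea concrete, here is a minimal sketch of kernelized attention where a node-level kernel and a cluster-level kernel are mixed in the style of Multiple Kernel Learning, and cluster-level queries attend over node-level keys. This is an illustrative assumption of how such a mechanism can be wired up, not the authors' implementation; the tensor shapes, the `alpha` mixing weight, and the scaled dot-product kernel are choices made for this example.

```python
import torch
import torch.nn.functional as F

def dual_granularity_attention(q_cluster, k_node, k_cluster_of_node, v_node, alpha=0.5):
    """Illustrative node-to-cluster attention with two mixed kernels.

    q_cluster:         (C, d) cluster-level queries
    k_node:            (N, d) node-level keys
    k_cluster_of_node: (N, d) cluster-level key assigned to each node
                       (e.g., the key of the cluster that node belongs to)
    v_node:            (N, d) node values
    alpha:             mixing weight in [0, 1] between the two kernels
    """
    d = q_cluster.size(-1)
    # Node-level kernel: similarity of each cluster query to each node key.
    node_scores = q_cluster @ k_node.t() / d ** 0.5              # (C, N)
    # Cluster-level kernel: similarity of each cluster query to the
    # cluster-level key associated with each node.
    cluster_scores = q_cluster @ k_cluster_of_node.t() / d ** 0.5  # (C, N)
    # MKL-style convex combination of the two kernels.
    scores = alpha * node_scores + (1 - alpha) * cluster_scores
    attn = F.softmax(scores, dim=-1)                              # (C, N)
    # Aggregate node values into cluster representations.
    return attn @ v_node                                          # (C, d)

# Toy usage with random tensors: 4 clusters, 16 nodes, 32-dim features.
C, N, d = 4, 16, 32
out = dual_granularity_attention(
    torch.randn(C, d), torch.randn(N, d), torch.randn(N, d), torch.randn(N, d)
)
print(out.shape)  # torch.Size([4, 32])
```

In this sketch the convex combination plays the role of the dual-granularity kernel: setting `alpha` close to 1 emphasizes node-level similarity, while values near 0 emphasize cluster-level similarity.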