The research introduces the Gradient Contribution (GC) method for efficient few-shot Graph Neural Architecture Search, addressing the weight coupling problem in NAS.
The GC method computes the cosine similarity of gradient directions among modules, then allocates modules with conflicting directions to different sub-supernets while grouping modules with similar directions into the same sub-supernet.
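To make the partitioning idea concrete, below is a minimal sketch, assuming each candidate module's gradient has been flattened to a vector of the same length. The greedy assignment heuristic and the function names (`gradient_cosine_matrix`, `partition_modules`) are illustrative assumptions, not the paper's exact GC algorithm.

```python
import torch
import torch.nn.functional as F


def gradient_cosine_matrix(grads):
    """Pairwise cosine similarity between flattened module gradients."""
    g = torch.stack([F.normalize(v, dim=0) for v in grads])
    return g @ g.t()


def partition_modules(grads, num_groups=2):
    """Greedy stand-in for GC partitioning: each module joins the
    sub-supernet whose members' gradients are most similar to its own,
    so modules with conflicting (negative-similarity) gradients end up
    in different sub-supernets."""
    sim = gradient_cosine_matrix(grads)
    n = len(grads)
    cap = -(-n // num_groups)  # ceil(n / num_groups), keeps groups balanced
    groups = [[] for _ in range(num_groups)]
    for m in range(n):
        scores = []
        for g in groups:
            if len(g) >= cap:
                scores.append(float("-inf"))  # group is full
            elif not g:
                scores.append(0.0)            # empty group is neutral
            else:
                scores.append(sum(sim[m, k].item() for k in g) / len(g))
        groups[scores.index(max(scores))].append(m)
    return groups


# Example: modules 0 and 1 have similar gradients, module 2 conflicts.
g1 = torch.tensor([1.0, 0.0])
g2 = torch.tensor([0.9, 0.1])
g3 = torch.tensor([-1.0, 0.0])
print(partition_modules([g1, g2, g3], num_groups=2))  # -> [[0, 1], [2]]
```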
The Unified Graph Neural Architecture Search (UGAS) framework explores optimal combinations of Message Passing Neural Networks (MPNNs) and Graph Transformers (GTs).
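As a rough illustration of what a hybrid candidate from such a unified search space might look like, here is a sketch using PyTorch Geometric's `GCNConv` as the MPNN choice and `TransformerConv` (neighborhood attention) as the GT choice. The class name, layer ordering, and dimensions are hypothetical, not the paper's definition of the UGAS search space.

```python
import torch
from torch_geometric.nn import GCNConv, TransformerConv


class HybridCandidate(torch.nn.Module):
    """Illustrative candidate architecture mixing an MPNN layer with a
    graph-transformer layer; a UGAS-style search would select the
    operator type for each layer from both families."""

    def __init__(self, in_dim, hid_dim, out_dim, heads=2):
        super().__init__()
        self.mpnn = GCNConv(in_dim, hid_dim)  # message-passing operator
        self.gt = TransformerConv(hid_dim, out_dim,
                                  heads=heads, concat=False)  # attention operator

    def forward(self, x, edge_index):
        # x: [num_nodes, in_dim] node features, edge_index: [2, num_edges]
        x = torch.relu(self.mpnn(x, edge_index))
        return self.gt(x, edge_index)


model = HybridCandidate(in_dim=16, hid_dim=32, out_dim=7)
```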
Experimental results show that GC improves supernet partitioning quality and time efficiency, while the architectures found by UGAS+GC outperform both manually designed GNNs and those discovered by existing NAS methods.