Source: Arxiv

A Pre-Training and Adaptive Fine-Tuning Framework for Graph Anomaly Detection

  • Graph anomaly detection (GAD) is challenging because abnormal nodes are scarce and label annotation is costly.
  • Graph pre-training has emerged as an effective approach to label-efficient learning in GAD, but anomalies mix homophilous and heterophilous connectivity, so individual nodes need selectively applied filters.
  • The proposed Pre-Training and Adaptive Fine-tuning (PAF) framework addresses this by jointly training low- and high-pass graph filters during pre-training and combining their representations with a gated fusion network during fine-tuning.
  • Experiments on ten benchmark datasets consistently demonstrate the effectiveness of PAF for graph anomaly detection.
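The low-/high-pass filtering and gated fusion described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the filter definitions (symmetrically normalized adjacency for low-pass, its complement for high-pass) and the sigmoid per-node gate are common conventions assumed here, and the weight names `W` and `b` are hypothetical.

```python
import numpy as np

def normalized_adjacency(A):
    # Symmetrically normalized adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def low_pass(A_norm, X):
    # Low-pass filter: smooths each node's features over its neighbours
    # (useful where anomalies sit in homophilous regions)
    return A_norm @ X

def high_pass(A_norm, X):
    # High-pass filter: keeps each node's deviation from its neighbours
    # (useful where anomalies are heterophilous, i.e. differ from neighbours)
    return X - A_norm @ X

def gated_fusion(H_low, H_high, W, b):
    # Per-node sigmoid gate adaptively mixes the two filtered views
    z = np.concatenate([H_low, H_high], axis=1) @ W + b
    g = 1.0 / (1.0 + np.exp(-z))
    return g * H_low + (1.0 - g) * H_high

# Tiny example: a 3-node path graph with 2-dimensional features
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.random.RandomState(0).randn(3, 2)
A_norm = normalized_adjacency(A)
H_low, H_high = low_pass(A_norm, X), high_pass(A_norm, X)
W = np.zeros((4, 1))   # hypothetical gate weights (learned in practice)
b = np.zeros(1)
H_fused = gated_fusion(H_low, H_high, W, b)
```

With zero gate weights the gate outputs 0.5, so the fused representation is the plain average of the two views; training the gate lets each node lean toward whichever filter suits its neighbourhood.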
