Researchers introduce Curriculum Negative Mining (CurNM) to address the shortcomings of negative sampling when training Temporal Graph Neural Networks (TGNNs).
CurNM is a model-aware curriculum learning framework that adapts the difficulty of negative samples by balancing random, historical, and hard negatives.
The framework maintains a dynamically updated negative pool, which helps it overcome challenges such as positive sparsity and positive shift in temporal networks.
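To make the mixing idea concrete, here is a minimal, hypothetical Python sketch, not the authors' implementation: the class name `CurriculumNegativeSampler`, its method names, and the validation-AP-based difficulty schedule are illustrative assumptions. It only shows how a single difficulty knob could shift sampled negatives from purely random toward historical/hard candidates drawn from a bounded, regularly refreshed pool.

```python
import random
from collections import deque


class CurriculumNegativeSampler:
    """Illustrative sketch (hypothetical, not the CurNM code) of a
    curriculum negative sampler: a bounded pool of candidate negatives
    plus a difficulty knob that controls the mix of random vs.
    historical/hard negatives."""

    def __init__(self, all_nodes, pool_size=1000):
        self.all_nodes = list(all_nodes)
        self.pool = deque(maxlen=pool_size)  # dynamically updated negative pool
        self.difficulty = 0.0                # 0 = mostly random, 1 = mostly hard

    def update_pool(self, observed_destinations, model_scores=None):
        """Refresh the pool with recently observed destinations; if model
        scores are given, insert the highest-scoring (hardest) ones first."""
        candidates = list(observed_destinations)
        if model_scores is not None:
            candidates = [n for n, _ in sorted(
                zip(candidates, model_scores), key=lambda x: -x[1])]
        self.pool.extend(candidates)

    def update_difficulty(self, recent_val_ap, target=0.9):
        """Model-aware schedule (assumed): raise difficulty as validation
        average precision approaches a target value."""
        self.difficulty = min(1.0, max(0.0, recent_val_ap / target))

    def sample(self, num_negatives):
        """Draw a mixture of random negatives and pool (historical/hard)
        negatives according to the current difficulty."""
        n_hard = int(round(self.difficulty * num_negatives)) if self.pool else 0
        n_rand = num_negatives - n_hard
        negatives = random.sample(self.all_nodes, k=min(n_rand, len(self.all_nodes)))
        if n_hard:
            negatives += random.choices(list(self.pool), k=n_hard)
        return negatives
```

In this sketch the curriculum is simply a monotone schedule on the random-to-hard ratio; the actual framework is model-aware in a richer sense, but the pool-plus-schedule structure mirrors the description above.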
Experiments on 12 datasets and 3 TGNNs show that CurNM significantly outperforms baseline methods, and thorough ablation studies confirm its usefulness and robustness.