Graph Contrastive Learning (GCL) learns node representations by aligning positive pairs and separating negative ones. This paper examines the connection between augmentation and downstream performance in GCL. Our findings reveal that GCL contributes to downstream tasks mainly by separating different classes. Consequently, perfect alignment and augmentation overlap may not yield the best downstream performance, and specifically designed augmentations are needed.
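
To make the align/separate intuition concrete, below is a minimal PyTorch sketch of an InfoNCE-style contrastive objective over node embeddings from two augmented graph views. It is an illustration of the general mechanism, not the loss used in this paper; the tensor names, dimensions, and temperature value are assumptions.

```python
# Minimal sketch (not the paper's method) of the align/separate mechanism in GCL:
# an InfoNCE-style loss over node embeddings from two augmented graph views.
# Embedding shapes and the temperature tau are illustrative assumptions.

import torch
import torch.nn.functional as F


def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """Node i in view 1 is aligned with node i in view 2 (positive pair)
    and pushed away from every other node (negatives)."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau          # cosine similarities between all cross-view pairs
    labels = torch.arange(z1.size(0))   # positive pairs sit on the diagonal
    # Symmetrize so both views serve as anchors.
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))


if __name__ == "__main__":
    # Stand-in embeddings for the same 8 nodes under two augmentations.
    torch.manual_seed(0)
    z_view1 = torch.randn(8, 16)
    z_view2 = z_view1 + 0.1 * torch.randn(8, 16)  # mildly perturbed second view
    print(info_nce_loss(z_view1, z_view2).item())
```

Minimizing this loss pulls the two views of each node together (alignment) while pushing apart embeddings of different nodes (separation), which is the trade-off the paper analyzes with respect to downstream class separation.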