- Physics-informed neural operators (PINOs) are effective for learning solution operators of PDEs.
- Recent research has shown that incorporating Lie point symmetry information can improve the training efficiency of PINOs.
- Lie point symmetries are typically integrated through data, architecture, or loss augmentation.
- However, traditional point symmetries can sometimes provide no training signal at all, limiting their usefulness on certain problems.
- To overcome this limitation, this work proposes a novel loss-augmentation strategy.
- The strategy leverages evolutionary representatives of point symmetries, a class of generalized symmetries of the underlying PDE (see the sketch after this list).
- Generalized symmetries yield a larger set of generators than standard point symmetries, and hence a more informative training signal.
- Using evolutionary representatives improves neural-operator performance, leading to better data efficiency and accuracy during training.
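For context, here is a minimal sketch of the evolutionary representative of a point symmetry and of one way such a generator could enter an augmented loss. The notation is the standard one from the Lie-symmetry literature; the PDE residual $\Delta$, the network prediction $u_{\theta}$, and the particular form of the symmetry loss $\mathcal{L}_{\text{sym}}$ below are illustrative assumptions rather than the paper's exact formulation.

For a scalar PDE $\Delta\big(x, u^{(n)}\big) = 0$ with a Lie point symmetry generator

$$
v \;=\; \sum_{i=1}^{p} \xi^{i}(x,u)\,\partial_{x^{i}} \;+\; \phi(x,u)\,\partial_{u},
$$

the evolutionary representative acts on $u$ alone, through the characteristic $Q$:

$$
v_{Q} \;=\; Q\,\partial_{u},
\qquad
Q\big(x, u^{(1)}\big) \;=\; \phi(x,u) \;-\; \sum_{i=1}^{p} \xi^{i}(x,u)\,\frac{\partial u}{\partial x^{i}} .
$$

Because $Q$ depends on first derivatives of $u$, $v_{Q}$ is a generalized (rather than point) vector field, yet it is a symmetry of the PDE exactly when $v$ is. One possible way to turn this into an augmented loss is to penalize the linearized symmetry condition evaluated at the predicted solution $u_{\theta}$:

$$
\mathcal{L}_{\text{sym}}(\theta)
\;=\;
\Big\| \operatorname{pr} v_{Q}\,(\Delta)\big|_{u = u_{\theta}} \Big\|^{2},
$$

where $\operatorname{pr} v_{Q}$ is the prolongation of $v_{Q}$ to the derivatives appearing in $\Delta$. Such a term would be added to the usual operator-learning and physics losses; its weighting and discretization are further assumptions not specified in this summary.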