Contextual Self-Modulation (CSM) is a regularization mechanism for Neural Context Flows (NCFs) that has demonstrated powerful meta-learning of physical systems.
However, CSM has limitations in other modalities and in high-data regimes.
This work introduces two extensions: iCSM, which expands CSM to infinite-dimensional variations, and StochasticNCF, which provides a low-cost approximation of meta-gradient updates.
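The low-cost approximation in StochasticNCF amounts to estimating the meta-gradient from a small random subset of environments rather than the full set. The sketch below illustrates that subsampling idea only; the context dimension, pool size, and helper names are illustrative assumptions, not the library's API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 10 environments, each with a learned 2-D context
# vector (dimensions chosen for illustration only).
contexts = rng.normal(size=(10, 2))

def sample_context_pool(i, pool_size=3):
    """Draw a random subset of other environments' contexts.

    Sketch of the StochasticNCF idea: instead of modulating environment i
    against every other environment at each meta-update, use a small
    random pool, reducing the cost of the meta-gradient estimate.
    """
    others = [j for j in range(len(contexts)) if j != i]
    picked = rng.choice(others, size=pool_size, replace=False)
    return contexts[picked]

pool = sample_context_pool(0)
```

The pool would then feed the same modulation loss as the full-batch version; only the number of cross-environment terms changes.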
These extensions are evaluated on tasks spanning dynamical systems, computer vision, and curve fitting.
Experiments incorporating higher-order Taylor expansions show that higher orders do not necessarily improve generalization.
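The Taylor expansions in question modulate one environment's vector field using another environment's context. A minimal first-order sketch, assuming a toy damped-oscillator vector field that is linear in a 2-D context (all names and the finite-difference Jacobian are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def f(x, ctx):
    # Toy vector field: a damped oscillator whose stiffness and damping
    # are set by the context vector (a, b). Illustrative assumption.
    a, b = ctx
    return np.array([x[1], -a * x[0] - b * x[1]])

def jac_ctx(x, ctx, eps=1e-6):
    # Central finite-difference Jacobian of f w.r.t. the context.
    J = np.zeros((2, len(ctx)))
    for k in range(len(ctx)):
        d = np.zeros(len(ctx))
        d[k] = eps
        J[:, k] = (f(x, ctx + d) - f(x, ctx - d)) / (2 * eps)
    return J

def csm_first_order(x, ctx_i, ctx_j):
    # First-order contextual self-modulation: predict environment j's
    # dynamics from a Taylor expansion around environment i's context.
    return f(x, ctx_i) + jac_ctx(x, ctx_i) @ (ctx_j - ctx_i)

x = np.array([1.0, 0.0])
ctx_i = np.array([1.0, 0.2])
ctx_j = np.array([1.2, 0.3])
approx = csm_first_order(x, ctx_i, ctx_j)
exact = f(x, ctx_j)
```

Because this toy field is linear in its context, the first-order expansion already recovers the exact dynamics, which hints at why stacking higher orders need not help when the context dependence is simple.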
Furthermore, CSM can be integrated into other meta-learning frameworks, as demonstrated with FlashCAVIA, a computationally efficient extension of the CAVIA framework.
The study emphasizes the benefits of CSM for meta-learning and out-of-distribution generalization, making it particularly well suited for physical systems.
An open-source library for integrating self-modulation into contextual meta-learning workflows is available at https://github.com/ddrous/self-mod.