MNO (Mamba Neural Operator) is a framework that extends neural-operator-based methods for solving partial differential equations (PDEs).
MNO establishes a theoretical connection between structured state-space models (SSMs) and neural operators, yielding a unified formulation that can be adapted to a range of operator-learning architectures.
By modeling long-range dependencies and continuous dynamics more effectively than standard Transformers, MNO is better suited to PDE-related tasks.
Extensive analysis shows that MNO substantially improves the expressive power and accuracy of neural operators.
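To make the SSM–neural-operator connection above concrete, the sketch below shows a minimal diagonal structured state-space layer applied along a 1-D spatial grid. It is an illustrative assumption of how such a recurrence could be realized, not the authors' implementation; all class and parameter names (`SimpleSSMLayer`, `d_state`, etc.) are hypothetical.

```python
# Minimal sketch of a diagonal structured SSM layer (illustrative only;
# not the MNO authors' code). Shapes and names are assumptions.
import torch
import torch.nn as nn


class SimpleSSMLayer(nn.Module):
    """Applies a discretized linear state-space recurrence along a 1-D grid."""

    def __init__(self, d_model: int, d_state: int = 16):
        super().__init__()
        # Diagonal continuous-time state matrix, parameterized so A < 0 (stable).
        self.log_neg_A = nn.Parameter(torch.zeros(d_model, d_state))
        self.B = nn.Parameter(0.1 * torch.randn(d_model, d_state))
        self.C = nn.Parameter(0.1 * torch.randn(d_model, d_state))
        self.log_dt = nn.Parameter(torch.zeros(d_model))  # learnable step size

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        # u: (batch, length, d_model), e.g. a PDE field sampled on a grid.
        batch, length, d_model = u.shape
        A = -torch.exp(self.log_neg_A)               # (d_model, d_state)
        dt = torch.exp(self.log_dt).unsqueeze(-1)    # (d_model, 1)
        # Zero-order-hold discretization of the continuous-time SSM.
        A_bar = torch.exp(dt * A)
        B_bar = (A_bar - 1.0) / A * self.B
        h = u.new_zeros(batch, d_model, A.shape[-1])  # hidden state
        outputs = []
        for t in range(length):                      # sequential scan over the grid
            h = A_bar * h + B_bar * u[:, t].unsqueeze(-1)
            outputs.append((h * self.C).sum(dim=-1))  # read-out y_t = C h_t
        return torch.stack(outputs, dim=1)           # (batch, length, d_model)


# Example usage (shapes are illustrative):
layer = SimpleSSMLayer(d_model=32)
y = layer(torch.randn(4, 128, 32))  # -> (4, 128, 32)
```

In practice, Mamba-style models replace the explicit Python loop with a parallel selective scan and make the projections and step size input-dependent; the loop here is kept only to show the recurrence that underlies the operator view.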