LrcSSM is a nonlinear recurrent model for sequence modeling that processes long sequences efficiently.
The model solves the full sequence in parallel with a single prefix scan rather than step by step, so total work and memory scale linearly with sequence length while the sequential depth grows only logarithmically.
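To make the prefix-scan idea concrete, the sketch below shows how a per-step diagonal recurrence of the form x_t = a_t * x_{t-1} + b_t can be solved for all time steps at once with an associative scan in JAX. This is a minimal illustration of the general technique under assumed shapes and names (`diagonal_scan` is not from the LrcSSM codebase), not the model's actual implementation.

```python
import jax
import jax.numpy as jnp

def diagonal_scan(a, b):
    """Solve x_t = a_t * x_{t-1} + b_t for all t in parallel.

    a, b: arrays of shape (T, D) holding per-step diagonal transitions and inputs.
    Returns the full state trajectory of shape (T, D).
    """
    def combine(left, right):
        # Composing two affine maps x -> a*x + b stays affine,
        # so the operator is associative and can be scanned in parallel.
        a_l, b_l = left
        a_r, b_r = right
        return a_r * a_l, a_r * b_l + b_r

    _, states = jax.lax.associative_scan(combine, (a, b))
    return states

# Example: T = 6 steps, D = 3 state channels (illustrative values only)
T, D = 6, 3
a = jax.nn.sigmoid(jax.random.normal(jax.random.PRNGKey(0), (T, D)))  # transitions in (0, 1)
b = jax.random.normal(jax.random.PRNGKey(1), (T, D))                  # per-step inputs
print(diagonal_scan(a, b).shape)  # (6, 3)
```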
Unlike Liquid-S4 and Mamba, LrcSSM comes with a formal gradient-stability guarantee, which makes training more reliable.
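The paper's exact stability argument is not reproduced here; as a hedged illustration only, one common way to make such guarantees possible in diagonal recurrences is to parameterize the transition values so their magnitude stays strictly below one, which bounds how gradients can grow through time. The helper name `stable_transition` and the sigmoid parameterization below are assumptions for the sketch.

```python
import jax
import jax.numpy as jnp

def stable_transition(raw_params):
    """Map unconstrained parameters to diagonal transition values in (0, 1).

    Keeping every transition magnitude below 1 bounds the Jacobian of the
    recurrence at each step, one standard ingredient in gradient-stability
    arguments for recurrent models (illustrative, not the paper's proof).
    """
    return jax.nn.sigmoid(raw_params)

raw = jax.random.normal(jax.random.PRNGKey(0), (6, 3))
a = stable_transition(raw)
# The product of per-step transition magnitudes controls gradient growth over time.
print(bool(jnp.max(jnp.abs(a)) < 1.0))  # True
```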
On long-range forecasting tasks, LrcSSM outperforms quadratic-attention Transformers as well as recurrent state-space baselines such as LRU, S5, and Mamba.