TransMamba is a framework that combines Transformer and Mamba for efficient long-sequence processing. It uses shared parameter matrices to switch between attention and state space model (SSM) mechanisms, and includes a Memory converter that bridges the two so information flows seamlessly across the switch. Experimental results demonstrate that TransMamba achieves superior training efficiency and performance compared to baseline models.
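To make the weight-sharing idea concrete, below is a minimal sketch of a layer whose single set of projection matrices is read as Q/K/V by an attention path and reinterpreted as C/B/x by an SSM-style path. This is an illustration under stated assumptions, not the paper's implementation: the class name `SharedDualLayer`, the `trans_point` length threshold, the per-input switching rule, and the simplified linear recurrence are all hypothetical, and the Memory converter itself is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedDualLayer(nn.Module):
    """Illustrative layer whose projection weights are shared between an
    attention path (read as Q/K/V) and an SSM-style path (read as C/B/x).
    The switching rule and the simplified recurrence are assumptions."""

    def __init__(self, d_model: int, trans_point: int):
        super().__init__()
        # One set of projections reused by both mechanisms.
        self.proj_q_or_c = nn.Linear(d_model, d_model, bias=False)
        self.proj_k_or_b = nn.Linear(d_model, d_model, bias=False)
        self.proj_v_or_x = nn.Linear(d_model, d_model, bias=False)
        self.trans_point = trans_point  # hypothetical switch threshold

    def attention_path(self, h: torch.Tensor) -> torch.Tensor:
        # Standard causal attention over the shared projections.
        q = self.proj_q_or_c(h)
        k = self.proj_k_or_b(h)
        v = self.proj_v_or_x(h)
        return F.scaled_dot_product_attention(q, k, v, is_causal=True)

    def ssm_path(self, h: torch.Tensor) -> torch.Tensor:
        # Stand-in linear recurrence: the running state accumulates the
        # outer products B_t x_t^T and is read out with C_t, giving cost
        # linear in sequence length instead of quadratic.
        c = self.proj_q_or_c(h)  # same weights, reinterpreted as C
        b = self.proj_k_or_b(h)  # ... as B
        x = self.proj_v_or_x(h)  # ... as x
        bsz, seq_len, d = h.shape
        state = h.new_zeros(bsz, d, d)
        outs = []
        for t in range(seq_len):
            state = state + b[:, t, :, None] * x[:, t, None, :]
            outs.append(torch.einsum("bd,bde->be", c[:, t], state))
        return torch.stack(outs, dim=1)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # Short inputs take the attention path; long ones switch to the
        # SSM recurrence, reusing the very same parameter matrices. (In
        # the framework summarized above, a Memory converter translates
        # attention state into SSM state at the switch; omitted here.)
        if h.size(1) <= self.trans_point:
            return self.attention_path(h)
        return self.ssm_path(h)


layer = SharedDualLayer(d_model=32, trans_point=64)
short = torch.randn(2, 16, 32)   # routed through attention
long = torch.randn(2, 128, 32)   # routed through the SSM recurrence
print(layer(short).shape, layer(long).shape)  # both (2, L, 32)
```

The sketch switches per input to keep the example short; the Memory converter summarized above exists precisely so that a switch can happen mid-stream, carrying the attention path's accumulated information over into the SSM state.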