Stochastic bilevel optimization (SBO) is increasingly important in machine learning owing to its nested problem structure. Decentralized approaches, such as D-SOBA, improve communication efficiency and algorithmic robustness. The D-SOBA framework has two variants: D-SOBA-SO, which uses second-order matrices, and D-SOBA-FO, which relies only on first-order gradients. A comprehensive non-asymptotic convergence analysis of D-SOBA reveals the joint impact of network topology and data heterogeneity.
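To make the setting concrete, here is a minimal toy sketch of a decentralized bilevel update: agents gossip-average their iterates with a mixing matrix and then take local inner and outer gradient steps. The problem instance, variable names, and step sizes are illustrative assumptions, not the paper's actual D-SOBA algorithm; on this quadratic toy the implicit derivative is trivial, so the distinction between the second-order and first-order variants disappears.

```python
import numpy as np

# Illustrative decentralized bilevel sketch (NOT the actual D-SOBA method).
# Each of n = 3 agents i holds a local bilevel problem:
#   inner:  y_i*(x) = argmin_y ||y - x||^2          (so y_i*(x) = x)
#   outer:  f_i(x)  = ||x - a_i||^2 + ||y_i*(x)||^2
# The network-average outer objective is minimized at x = mean(a) / 2.
# Here dy*/dx = 1, so the hypergradient needs no Hessian estimate;
# the real SO variant would estimate second-order matrices, while
# the FO variant approximates them with first-order information.

n, lr, T = 3, 0.05, 500
a = np.array([0.5, 1.0, 1.5])      # heterogeneous local data
W = np.full((n, n), 1.0 / n)       # doubly stochastic mixing matrix
x = np.zeros(n)                    # per-agent outer variables
y = np.zeros(n)                    # per-agent inner variables

for _ in range(T):
    x, y = W @ x, W @ y                      # gossip averaging
    y -= lr * 2.0 * (y - x)                  # inner gradient step
    x -= lr * (2.0 * (x - a) + 2.0 * y)      # hypergradient step

print(x.mean())   # close to 0.5 = mean(a) / 2
```

With heterogeneous local data, each agent's iterate retains a small step-size-dependent deviation from the network average, which is exactly the kind of heterogeneity effect a non-asymptotic analysis quantifies.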