Training deep neural networks typically incurs substantial computational costs in both forward and backward propagation. Dropping Backward Propagation (DropBP) is an approach designed to reduce these costs while maintaining accuracy: it randomly drops layers during backward propagation while leaving the forward pass unchanged. Using DropBP can reduce training time, accelerate convergence, and enable training with longer sequence lengths.
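
For intuition, below is a minimal PyTorch-style sketch of the idea, not the official DropBP implementation. The wrapper class, its `drop_rate` parameter, and the residual-block structure are illustrative assumptions: with some probability the sub-layer's output is computed without building an autograd graph, so its forward result is unchanged but backward propagation through it is skipped.

```python
import torch
import torch.nn as nn

class DropBPBlock(nn.Module):
    """Illustrative sketch: wraps a residual sub-layer and, with probability
    `drop_rate`, skips backward propagation through it while keeping the
    forward output identical."""

    def __init__(self, sublayer: nn.Module, drop_rate: float = 0.5):
        super().__init__()
        self.sublayer = sublayer
        self.drop_rate = drop_rate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training and torch.rand(()).item() < self.drop_rate:
            # Forward computation still runs, but no graph is recorded for the
            # sub-layer, so gradients (and their cost) are not computed here;
            # the gradient only flows through the residual connection.
            with torch.no_grad():
                return x + self.sublayer(x)
        # Normal path: forward and backward both go through the sub-layer.
        return x + self.sublayer(x)

# Hypothetical usage: wrap each residual sub-layer of a model so that, on
# average, a fraction of layers is excluded from backward propagation.
block = DropBPBlock(nn.Sequential(nn.Linear(64, 64), nn.GELU(), nn.Linear(64, 64)),
                    drop_rate=0.5)
out = block(torch.randn(8, 64))
out.sum().backward()
```

Because the skipped sub-layer is still evaluated in the forward pass, the model's outputs are unaffected; only the backward cost and activation memory for that layer are saved for that step.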