This paper presents a study of Discretized Neural Networks (DNNs), which are composed of low-precision weights and activations. Training such networks with the Straight-Through Estimator (STE) to approximate gradients introduces gradient mismatch. The paper proposes treating this gradient mismatch as a metric perturbation on a Riemannian manifold, analyzed through the lens of duality theory. Experimental results demonstrate that the proposed method achieves superior and more stable performance for DNNs compared to other training-based methods.
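To make the source of the gradient mismatch concrete, here is a minimal sketch of the Straight-Through Estimator. It assumes sign-based binarization with gradient clipping, a common choice; the paper's actual discretization scheme and the function names here are illustrative, not taken from the paper.

```python
def quantize_forward(w):
    # Forward pass: discretize each weight to {-1.0, +1.0}.
    return [1.0 if x >= 0 else -1.0 for x in w]

def quantize_backward(grad_out, w, clip=1.0):
    # Backward pass: STE pretends the quantizer is the identity,
    # passing the incoming gradient straight through (zeroed where
    # |w| > clip, as in common binarization schemes). The true
    # gradient of the sign function is zero almost everywhere, so
    # this surrogate gradient does not match it -- the "gradient
    # mismatch" the paper addresses.
    return [g if abs(x) <= clip else 0.0 for g, x in zip(grad_out, w)]

w = [0.3, -1.5, 0.7]
g = [1.0, 1.0, 1.0]
print(quantize_forward(w))      # [1.0, -1.0, 1.0]
print(quantize_backward(g, w))  # [1.0, 0.0, 1.0]
```

The discrepancy between the surrogate backward pass and the true (almost-everywhere-zero) derivative of the quantizer is what the paper reinterprets as a metric perturbation on a Riemannian manifold.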