Negative distance kernels are widely used in maximum mean discrepancies (MMDs) and have shown favorable numerical results in various applications. However, due to their non-smoothness at x = y, classical theoretical results do not apply. A new Lipschitz differentiable kernel is proposed that maintains the favorable properties of the negative distance kernel. In gradient descent methods, the new kernel performs comparably to the negative distance kernel, but now with theoretical guarantees.
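
For concreteness, below is a minimal sketch (not from the paper) of the squared MMD under the negative distance kernel K(x, y) = -||x - y||, using the standard biased V-statistic estimator; the function names and sample data are illustrative only, and the paper's proposed Lipschitz differentiable kernel is not reproduced here since its form is not given in this abstract.

```python
import numpy as np

def neg_distance_kernel(x, y):
    # Negative distance (energy) kernel K(x, y) = -||x - y||;
    # conditionally positive definite, but non-smooth at x = y.
    return -np.linalg.norm(x - y)

def mmd_squared(X, Y, kernel=neg_distance_kernel):
    # Biased (V-statistic) estimate of MMD^2 between samples X and Y:
    # mean K(x, x') + mean K(y, y') - 2 * mean K(x, y).
    kxx = np.mean([[kernel(a, b) for b in X] for a in X])
    kyy = np.mean([[kernel(a, b) for b in Y] for a in Y])
    kxy = np.mean([[kernel(a, b) for b in Y] for a in X])
    return kxx + kyy - 2.0 * kxy

# Illustrative usage: two Gaussian samples with shifted means.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(100, 2))
Y = rng.normal(0.5, 1.0, size=(100, 2))
print(mmd_squared(X, Y))  # positive when the distributions differ
```

With this kernel, the estimator reduces to the energy distance, which is nonnegative and vanishes when the two distributions coincide; the non-smoothness of the kernel at x = y is what blocks the classical differentiability arguments the abstract refers to.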