China's DeepSeek AI has launched DeepEP, a communication library for mixture-of-experts (MoE) model training and inference. The library aims to improve communication between GPUs in machine learning models that use the MoE architecture. DeepEP offers optimized kernels for efficient data movement and achieves high performance on NVIDIA H800 GPUs with InfiniBand RDMA network cards. DeepSeek plans to open-source five repositories, including DeepEP, as part of its commitment to transparency and openness.
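To give a sense of the communication pattern such a library optimizes, the sketch below shows expert-parallel token "dispatch" using plain torch.distributed all-to-all as a stand-in for optimized kernels. This is an illustrative assumption of how MoE token routing across GPUs works, not DeepEP's actual API; the function name and tensor shapes are hypothetical.

```python
# Illustrative sketch of the MoE dispatch step: each GPU sends every token to
# the rank hosting that token's assigned expert. Uses vanilla torch.distributed
# all-to-all as a placeholder for an optimized communication library.
import torch
import torch.distributed as dist


def dispatch_tokens(tokens, dest_rank_per_token, world_size):
    """Route tokens to the ranks that host their assigned experts.

    tokens:              (num_tokens, hidden_dim) tensor on this rank
    dest_rank_per_token: (num_tokens,) int tensor of destination ranks
    """
    # Sort tokens by destination rank so each rank's slice is contiguous.
    order = torch.argsort(dest_rank_per_token)
    tokens_sorted = tokens[order]
    send_counts = torch.bincount(dest_rank_per_token, minlength=world_size)

    # Exchange per-rank counts so every rank knows how much it will receive.
    recv_counts = torch.empty_like(send_counts)
    dist.all_to_all_single(recv_counts, send_counts)

    # Exchange the token payloads themselves (variable-sized all-to-all).
    recv_tokens = tokens.new_empty((int(recv_counts.sum()), tokens.shape[1]))
    dist.all_to_all_single(
        recv_tokens,
        tokens_sorted,
        output_split_sizes=recv_counts.tolist(),
        input_split_sizes=send_counts.tolist(),
    )
    # The reverse "combine" step sends expert outputs back along the same routes.
    return recv_tokens, order, send_counts, recv_counts
```

In a full MoE layer, a matching "combine" call returns the expert outputs to their original ranks; overlapping and accelerating these two all-to-all exchanges is exactly the kind of work a dedicated communication library targets.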