Image Credit: Analyticsindiamag

DeepSeek Launches DeepEP, a Communication Library for Mixture-of-Experts (MoE) Model Training and Inference

  • China's DeepSeek AI has launched DeepEP, a communication library for mixture-of-experts (MoE) model training and inference.
  • The library aims to improve inter-GPU communication for machine learning models built on the MoE architecture.
  • DeepEP offers optimized kernels for efficient data movement and achieves high performance on NVIDIA H800 GPUs and InfiniBand RDMA network cards.
  • DeepSeek plans to open-source five repositories, including DeepEP, as part of its commitment to transparency and openness.
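The communication DeepEP targets is the dispatch/combine pattern at the heart of MoE layers: each token is routed to one or more experts (which may live on different GPUs), processed there, and its output is scattered back to the token's original position. The source does not show DeepEP's API, so the sketch below is a generic single-process illustration of that pattern in NumPy; the function name, the top-1 gating, and the placeholder per-expert computation are all assumptions for illustration, not DeepEP code.

```python
import numpy as np

def moe_dispatch_combine(tokens, gate_logits, num_experts):
    """Illustrative single-process sketch of the dispatch/combine
    communication pattern that MoE libraries optimize across GPUs.
    Not DeepEP's API: a generic top-1-routing example."""
    # Route: pick the highest-scoring expert for each token (top-1 gating).
    expert_ids = gate_logits.argmax(axis=1)

    out = np.empty_like(tokens)
    for e in range(num_experts):
        idx = np.where(expert_ids == e)[0]
        if idx.size == 0:
            continue
        # "Dispatch": gather this expert's tokens (in a multi-GPU setting
        # this gather is an all-to-all exchange between devices).
        batch = tokens[idx]
        # Placeholder expert: a per-expert scaling stands in for an FFN.
        processed = batch * (e + 1)
        # "Combine": scatter results back to the original token positions.
        out[idx] = processed
    return out, expert_ids

tokens = np.ones((4, 2))
gate_logits = np.array([[0.9, 0.1], [0.2, 0.8], [0.7, 0.3], [0.1, 0.9]])
out, expert_ids = moe_dispatch_combine(tokens, gate_logits, num_experts=2)
```

On a multi-GPU cluster, the gather and scatter above become all-to-all network transfers, which is why kernels tuned for hardware like H800 GPUs and InfiniBand RDMA matter for MoE throughput.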
