Source: Analyticsindiamag · 1w
Baidu’s ERNIE 4.5 is Built On a ‘Heterogeneous MoE’ Architecture

  • Baidu has open-sourced the ERNIE 4.5 family of models, spanning language-only and multimodal variants, under the Apache 2.0 license.
  • ERNIE 4.5 variants outperform competitors such as DeepSeek-V3 671B and Alibaba's Qwen3-30B-A3B on benchmarks.
  • The core innovation of the ERNIE 4.5 models is a heterogeneous-modality Mixture of Experts (MoE) structure for multimodal learning (see the sketch after this list).
  • Through aggressive training optimizations, the largest ERNIE 4.5 model achieved high FLOPs utilization on NVIDIA H800 GPUs.
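
The article only names the heterogeneous-modality MoE idea, so here is a minimal, hypothetical PyTorch sketch of what modality-separated expert routing with a set of shared experts could look like. The class name HeterogeneousMoELayer, the expert counts, the top-k routing, and the text/vision split are illustrative assumptions, not Baidu's released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HeterogeneousMoELayer(nn.Module):
    """Illustrative heterogeneous-modality MoE layer (not Baidu's code).

    Text tokens are routed only over text experts, vision tokens only over
    vision experts, and a small set of shared experts sees every token.
    """

    def __init__(self, d_model=512, n_text_experts=4, n_vision_experts=4,
                 n_shared_experts=1, top_k=2):
        super().__init__()

        def make_expert():
            return nn.Sequential(
                nn.Linear(d_model, 4 * d_model), nn.GELU(),
                nn.Linear(4 * d_model, d_model))

        self.text_experts = nn.ModuleList(make_expert() for _ in range(n_text_experts))
        self.vision_experts = nn.ModuleList(make_expert() for _ in range(n_vision_experts))
        self.shared_experts = nn.ModuleList(make_expert() for _ in range(n_shared_experts))
        # Separate routers per modality, each scoring only that modality's experts.
        self.text_router = nn.Linear(d_model, n_text_experts)
        self.vision_router = nn.Linear(d_model, n_vision_experts)
        self.top_k = top_k

    def _route(self, x, router, experts):
        # x: (n_tokens, d_model). Pick top-k experts per token and mix their outputs.
        scores = F.softmax(router(x), dim=-1)                   # (n, n_experts)
        weights, idx = torch.topk(scores, self.top_k, dim=-1)   # (n, k)
        weights = weights / weights.sum(dim=-1, keepdim=True)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k:k + 1] * expert(x[mask])
        return out

    def forward(self, x, is_vision):
        # x: (n_tokens, d_model); is_vision: bool mask of shape (n_tokens,)
        out = torch.zeros_like(x)
        text_mask, vision_mask = ~is_vision, is_vision
        if text_mask.any():
            out[text_mask] = self._route(x[text_mask], self.text_router, self.text_experts)
        if vision_mask.any():
            out[vision_mask] = self._route(x[vision_mask], self.vision_router, self.vision_experts)
        # Shared experts process every token regardless of modality.
        for expert in self.shared_experts:
            out = out + expert(x)
        return out


# Example usage: a mixed batch of 6 text tokens followed by 4 vision tokens.
layer = HeterogeneousMoELayer(d_model=512)
tokens = torch.randn(10, 512)
is_vision = torch.tensor([False] * 6 + [True] * 4)
y = layer(tokens, is_vision)   # -> shape (10, 512)
```

The design point the sketch tries to capture is that each modality keeps its own expert pool and router, so text and vision tokens do not compete for the same experts, while the shared experts carry cross-modal representation; whether ERNIE 4.5 implements it exactly this way is not stated in the summary.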
