Salesforce AI Research has launched Moirai-MoE, the first mixture-of-experts time series foundation model. The model achieves token-level specialization and outperforms its predecessor, Moirai, by 17%. In testing, Moirai-MoE also surpassed larger competing models by 8% to 17% across different metrics. It simplifies time series forecasting by combining sparse mixture-of-experts transformer layers with a decoder-only training objective.
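
To make the idea of token-level specialization concrete, here is a minimal sketch of a sparse mixture-of-experts feed-forward layer with per-token top-k routing, written in PyTorch. The class name, expert count, hidden size, and top-k value are illustrative assumptions for demonstration, not Moirai-MoE's actual architecture or configuration.

```python
# Illustrative sketch of a sparse mixture-of-experts (MoE) layer with
# token-level top-k routing. Hyperparameters (num_experts=8, top_k=2)
# are assumptions for demonstration, not Moirai-MoE's settings.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoELayer(nn.Module):
    def __init__(self, d_model: int, d_hidden: int,
                 num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # One feed-forward "expert" per slot; only top_k run per token.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        )
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> flatten to per-token rows.
        batch, seq_len, d_model = x.shape
        tokens = x.reshape(-1, d_model)

        # Pick the top_k experts per token; normalize their weights.
        logits = self.router(tokens)              # (tokens, num_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)      # (tokens, top_k)

        out = torch.zeros_like(tokens)
        for expert_id, expert in enumerate(self.experts):
            # Find (token, slot) pairs routed to this expert.
            token_idx, slot_idx = (indices == expert_id).nonzero(as_tuple=True)
            if token_idx.numel() == 0:
                continue  # This expert receives no tokens in this batch.
            expert_out = expert(tokens[token_idx])
            out[token_idx] += weights[token_idx, slot_idx].unsqueeze(-1) * expert_out

        return out.reshape(batch, seq_len, d_model)


# Usage: route a batch of 4 series, each with 32 tokens of width 64.
layer = SparseMoELayer(d_model=64, d_hidden=256)
y = layer(torch.randn(4, 32, 64))
print(y.shape)  # torch.Size([4, 32, 64])
```

Because each token activates only its top-k experts, different experts can specialize in different patterns while the per-token compute stays close to that of a single dense feed-forward block.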