Universal forecasting is challenging because time series data are highly diverse in nature. The Transformer architecture has given rise to Large Time Series Models (LTSMs) that support zero-shot forecasting across different time series behaviors. Notable models in this space include PatchTST, DLinear, and FEDformer, as well as Moirai-MoE from Salesforce AI Research, a foundation model built for zero-shot forecasting. Moirai-MoE combines the Moirai architecture with a Mixture of Experts (MoE) design to deliver more advanced forecasting capabilities.
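
To make the Mixture of Experts idea concrete, here is a minimal, generic sparse MoE feed-forward layer in PyTorch: a gating network scores each token, the top-k experts are selected, and their outputs are combined with renormalized gate weights. This is an illustrative sketch of the general technique only, not the actual Moirai-MoE implementation; the class name, dimensions, and expert count are assumptions chosen for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoELayer(nn.Module):
    """Illustrative sparse Mixture-of-Experts feed-forward layer.

    A gating network routes each token to its top-k experts, and the
    selected expert outputs are mixed with renormalized gate weights.
    Generic sketch for explanation, not the Moirai-MoE code.
    """

    def __init__(self, d_model: int, d_hidden: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> flatten time steps/tokens for routing
        batch, seq_len, d_model = x.shape
        tokens = x.reshape(-1, d_model)

        gate_logits = self.gate(tokens)                           # (tokens, num_experts)
        weights, expert_idx = gate_logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)                      # renormalize over the chosen experts

        out = torch.zeros_like(tokens)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = expert_idx[:, k] == e                      # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(tokens[mask])

        return out.reshape(batch, seq_len, d_model)


if __name__ == "__main__":
    layer = SparseMoELayer(d_model=64, d_hidden=128)
    y = layer(torch.randn(4, 32, 64))  # batch of 4 series, 32 time steps, 64-dim embeddings
    print(y.shape)                     # torch.Size([4, 32, 64])
```

In an MoE Transformer, a layer like this typically replaces the dense feed-forward block, so only a small subset of the experts is active per token while the total parameter count grows.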