Time Series Foundation Models (TSFMs) often struggle to outperform smaller, specialized models, even after fine-tuning on task-specific data.
Empirical studies show that TSFMs exhibit inherent sparsity and redundancy, reflecting their design for adaptability across diverse tasks; this suggests that only a subset of the network is relevant to any single task.
A structured pruning method is proposed to improve TSFM adaptation: task-relevant network substructures are identified and retained first, so that subsequent fine-tuning operates in a smaller, better-focused parameter space (see the sketch below).
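To make the prune-then-fine-tune idea concrete, here is a minimal sketch in PyTorch: a feed-forward block's hidden units are pruned as whole structures by a magnitude-based importance score, and only the retained substructure is then fine-tuned. The importance criterion, layer sizes, and `keep_ratio` are illustrative assumptions, not the paper's exact method.

```python
# Minimal sketch of structured pruning followed by fine-tuning.
# Assumptions: magnitude-based importance, toy layer sizes, keep_ratio=0.5.
import torch
import torch.nn as nn

class FeedForward(nn.Module):
    def __init__(self, d_model=64, d_hidden=256):
        super().__init__()
        self.fc1 = nn.Linear(d_model, d_hidden)
        self.fc2 = nn.Linear(d_hidden, d_model)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

def prune_hidden_units(ffn, keep_ratio=0.5):
    """Keep the hidden units whose outgoing weights have the largest
    L1 norm; drop the rest as a structured (whole-unit) prune."""
    importance = ffn.fc2.weight.abs().sum(dim=0)      # one score per hidden unit
    k = max(1, int(keep_ratio * importance.numel()))
    keep = importance.topk(k).indices.sort().values   # indices of retained units

    pruned = FeedForward(ffn.fc1.in_features, k)
    with torch.no_grad():
        pruned.fc1.weight.copy_(ffn.fc1.weight[keep])
        pruned.fc1.bias.copy_(ffn.fc1.bias[keep])
        pruned.fc2.weight.copy_(ffn.fc2.weight[:, keep])
        pruned.fc2.bias.copy_(ffn.fc2.bias)
    return pruned

# Prune first, then fine-tune only the retained substructure.
model = prune_hidden_units(FeedForward(), keep_ratio=0.5)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
x, y = torch.randn(32, 64), torch.randn(32, 64)       # toy fine-tuning batch
for _ in range(10):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()
```

Pruning whole hidden units (rather than individual weights) keeps the resulting network dense and hardware-friendly, which is what distinguishes structured pruning from unstructured sparsification.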
Experiments on seven TSFMs and six benchmarks show that pruning before fine-tuning improves forecasting performance, enabling the pruned models to surpass specialized baselines.