A novel cross-domain knowledge transfer framework is proposed to enhance the performance of Large Language Models (LLMs) in time series forecasting.
The approach systematically infuses LLMs with structured temporal information to improve forecasting accuracy in domains such as energy systems, finance, and healthcare.
Evaluation on a real-world time series dataset shows that knowledge-informed forecasting significantly outperforms a naive baseline that receives no auxiliary information.
These findings demonstrate the potential of knowledge transfer strategies to improve both the predictive accuracy and the generalization of LLMs in domain-specific forecasting tasks.