Adapt Data to Model: Adaptive Transformation Optimization for Domain-shared Time Series Foundation Models
Abstract
Large time series models (LTMs) have recently demonstrated powerful capabilities for universal forecasting. However, these models still struggle with the diversity and nonstationarity of time series, resulting in an unsatisfactory trade-off between forecasting performance and generalizability. Instead of continually creating new models for different domains, this paper proposes a novel framework, time-series adaptive transformation optimization (TATO), which enables a frozen pre-trained LTM to fit various downstream domains through an empirically optimal time-series transformation pipeline. Three representative types of time series transformations, namely context slicing, scale normalization, and outlier correction, are constructed to help LTMs fit the target domain. A two-stage ranking procedure is also designed to ensure the robustness of the optimization by filtering out transformation pipelines that underperform on specific metrics. Extensive evaluations on state-of-the-art pre-trained LTMs and widely used datasets demonstrate that TATO performs universally well and significantly enhances domain-adaptive forecasting performance, achieving a maximum MSE reduction of 68.4% and an average reduction of 16.0%. In most cases, TATO optimizes a time series transformation pipeline in under 2 minutes, making it practical for real-world applications. We have published the source code of TATO at https://anonymous.4open.science/r/TATO-D55C.
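To make the three transformation types concrete, the following is a minimal sketch of what one candidate pipeline might look like. The function names, the z-score form of normalization, the 3-sigma clipping rule for outlier correction, and the default context length are illustrative assumptions for exposition, not TATO's actual implementation; TATO searches over such choices to find an empirically optimal combination for the target domain.

```python
import numpy as np


def context_slice(x, context_len):
    # Context slicing: keep only the most recent context_len points
    # so the frozen LTM sees a window matched to the target domain.
    return x[-context_len:]


def scale_normalize(x):
    # Scale normalization (illustrated here as a z-score): rescale the
    # context to zero mean and unit variance, returning the statistics
    # so predictions can be mapped back to the original scale.
    mu, sigma = x.mean(), x.std() + 1e-8
    return (x - mu) / sigma, (mu, sigma)


def outlier_correct(x, k=3.0):
    # Outlier correction (illustrated as 3-sigma clipping): bound values
    # of the already-normalized context to [-k, k].
    return np.clip(x, -k, k)


def transform_pipeline(x, context_len=96, k=3.0):
    # One hypothetical transformation pipeline applied before the LTM.
    x = context_slice(np.asarray(x, dtype=float), context_len)
    x, stats = scale_normalize(x)
    x = outlier_correct(x, k)
    return x, stats
```

In a framework like TATO, many such pipelines (varying the context length, normalization scheme, and outlier rule) would be evaluated and ranked, with the frozen LTM consuming the transformed context.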