Poster in Workshop: Advances in Financial AI: Opportunities, Innovations, and Responsible AI
Continual Domain Adaptation in Time Series via Parameter-Efficient Dual Adapters and Prompt Tuning
Ashish Mishra · Suparna Bhattacharya · Martin Foltin
Time series data are vital in many real-world fields, such as finance, healthcare, industrial monitoring, and environmental forecasting, where statistical distributions can change over time. These series are often gathered from multiple sources or environments (domains) with different statistical characteristics. This discrepancy, known as domain shift, poses significant challenges for machine learning models, which are generally built on the assumption of stationary data. In particular, models fine-tuned for a specific domain tend to perform poorly when deployed in new domains, an issue that becomes even more pronounced when new domains are encountered sequentially. There is therefore a pressing need for models that can quickly adapt to new domains while retaining the knowledge acquired from previous ones. To address this, we propose a continual domain adaptation framework for time series data that uses a frozen Transformer-based backbone enhanced with domain-specific prompt tokens and residual adapters. The model is initially trained on Domain 0, and for each new domain, only a small set of domain-specific parameters is updated. Our framework is applied sequentially across multiple domains and addresses both forecasting and classification tasks for univariate and multivariate time series. To demonstrate its effectiveness, we empirically evaluate the model on challenging synthesized time series datasets covering both univariate and multivariate cases. The results show that our approach adapts effectively to new domains while preserving knowledge from previously trained domains, making it well suited for continual domain adaptation in forecasting and classification.
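The abstract does not include an implementation, so the following PyTorch sketch only illustrates the general pattern it describes: a shared Transformer encoder that is frozen after Domain 0, with a small bank of learnable prompt tokens prepended to the input and a bottleneck residual adapter allocated per domain. All names here (ContinualTSModel, ResidualAdapter, n_prompts, bottleneck, add_domain) are illustrative assumptions, and a single residual adapter per domain stands in for the paper's dual-adapter design, whose details the abstract does not specify.

```python
import torch
import torch.nn as nn

class ResidualAdapter(nn.Module):
    """Small bottleneck MLP applied residually on top of the frozen backbone."""
    def __init__(self, d_model, bottleneck=16):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)
        self.up = nn.Linear(bottleneck, d_model)
        nn.init.zeros_(self.up.weight)  # start as an identity mapping
        nn.init.zeros_(self.up.bias)

    def forward(self, x):
        return x + self.up(torch.relu(self.down(x)))

class ContinualTSModel(nn.Module):
    """Frozen Transformer backbone + per-domain prompt tokens and adapters (sketch)."""
    def __init__(self, n_features, d_model=64, n_layers=2, n_heads=4,
                 n_prompts=8, horizon=24):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, horizon)
        # Domain-specific parameters, grown as new domains arrive.
        self.prompts = nn.ParameterList()   # one prompt bank per domain
        self.adapters = nn.ModuleList()     # one adapter per domain
        self.n_prompts, self.d_model = n_prompts, d_model

    def add_domain(self, freeze_existing=True):
        """Allocate fresh prompt tokens and an adapter for a new domain.

        For Domain 0, call with freeze_existing=False so the shared
        backbone is trained; afterwards only new per-domain parameters
        remain trainable.
        """
        if freeze_existing:
            for p in self.parameters():
                p.requires_grad = False
        self.prompts.append(nn.Parameter(
            torch.randn(self.n_prompts, self.d_model) * 0.02))
        self.adapters.append(ResidualAdapter(self.d_model))
        return len(self.prompts) - 1  # id of the new domain

    def forward(self, x, domain_id):
        # x: (batch, seq_len, n_features)
        h = self.embed(x)
        prompt = self.prompts[domain_id].unsqueeze(0).expand(x.size(0), -1, -1)
        h = torch.cat([prompt, h], dim=1)   # prepend domain prompt tokens
        h = self.backbone(h)
        h = self.adapters[domain_id](h)     # domain-specific residual correction
        return self.head(h[:, -1])          # forecast from the last time step

# Sketch of the sequential training loop:
model = ContinualTSModel(n_features=3)
d0 = model.add_domain(freeze_existing=False)   # Domain 0: train everything
# ... fit on Domain 0 ...
d1 = model.add_domain()                        # new domain: backbone frozen
opt = torch.optim.Adam(p for p in model.parameters() if p.requires_grad)
# ... fit only the new prompts and adapter on Domain 1 ...
```

Because add_domain freezes all existing parameters before allocating new ones, the optimizer for each new domain sees only that domain's prompt bank and adapter, which keeps the per-domain update small and leaves earlier domains' parameters intact, matching the knowledge-retention behavior the abstract claims.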