Towards Generalizable Time Series Forecasting Via IB-Regularized Transformer-Diffusion
Citations
Web of Science: 0
Scopus: 0

Abstract

Multivariate Time Series Forecasting (MTSF) remains challenging due to the need to capture complex temporal dependencies and to ensure robustness against distribution shift. While Transformer-based models excel at capturing long-range patterns and diffusion models offer generative flexibility, directly applying diffusion often leads to excessive uncertainty. To address these limitations, we propose a unified forecasting framework (IB-TransDiff) that combines Transformer-based global modeling with diffusion-based local refinement, guided by the Information Bottleneck (IB) principle. To improve generalization, we introduce a novel regularization strategy that penalizes the second moment of the conditional embedding, reducing its mutual information with the input. Experiments on multiple benchmarks demonstrate that our method consistently outperforms existing approaches in MSE and MAE. In addition, our method reduces uncertainty, achieving notable improvements in the QICE, CRPS, and PICP metrics. These results highlight the value of integrating information-theoretic constraints into conditional diffusion models for robust and accurate time series forecasting.
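The regularization strategy described above can be sketched as a loss term. This is a minimal illustration, not the paper's implementation: the function name `ib_regularized_loss`, the weight `beta`, and the combination with a plain MSE objective are all assumptions made for clarity.

```python
import numpy as np

def ib_regularized_loss(pred, target, cond_embedding, beta=1e-3):
    """Forecast loss plus an IB-style penalty (illustrative sketch).

    The second moment of the conditional embedding acts as a tractable
    surrogate penalty that discourages the embedding from carrying
    excess information about the input, in the spirit of the
    Information Bottleneck principle.
    """
    # Standard forecasting error term (MSE).
    mse = np.mean((pred - target) ** 2)
    # Second-moment penalty on the conditional embedding.
    ib_penalty = np.mean(cond_embedding ** 2)
    # beta trades off accuracy against compression of the embedding.
    return mse + beta * ib_penalty
```

In practice such a term would be added to the training objective of the conditional diffusion model, with `beta` tuned on a validation set.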

Keywords

Information Bottleneck; Time Series Forecasting
Title
Towards Generalizable Time Series Forecasting Via IB-Regularized Transformer-Diffusion
Authors
Na, Dagyeong; Kang, Jinho; Kwon, Junseok
DOI
10.1109/ICDM65498.2025.00153
Publication Date
2025
Type
Conference Paper
Journal
Proceedings - IEEE International Conference on Data Mining, ICDM
Pages
1435-1444