WaveBound: dynamic error bounds for stable time series forecasting

Y Cho, D Kim, D Kim, MA Khan… - Advances in Neural Information Processing Systems, 2022 - proceedings.neurips.cc
Abstract
Time series forecasting has become a critical task due to its high practicality in real-world applications such as traffic, energy consumption, economics and finance, and disease analysis. Recent deep-learning-based approaches have shown remarkable success in time series forecasting. Nonetheless, due to the dynamics of time series data, deep networks still suffer from unstable training and overfitting. Inconsistent patterns appearing in real-world data lead the model to be biased toward a particular pattern, thus limiting generalization. In this work, we introduce dynamic error bounds on the training loss to address the overfitting issue in time series forecasting. Specifically, we propose a regularization method called WaveBound, which estimates adequate error bounds on the training loss for each time step and feature at each iteration. By allowing the model to focus less on unpredictable data, WaveBound stabilizes the training process, thus significantly improving generalization. In extensive experiments, we show that WaveBound consistently improves upon existing models, including the state-of-the-art model, by large margins.
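To make the idea of per-step, per-feature error bounds concrete, the sketch below shows one way a bounded training loss could be applied in PyTorch, in the spirit of a flooding-style regularizer. The `bound` tensor, the `bounded_mse_loss` helper, and the EMA/frozen `target_model` mentioned in the usage comments are illustrative assumptions, not the paper's exact formulation.

```python
import torch

def bounded_mse_loss(pred: torch.Tensor,
                     target: torch.Tensor,
                     bound: torch.Tensor,
                     margin: float = 1e-2) -> torch.Tensor:
    """Elementwise MSE loss that is not pushed below a dynamic bound.

    pred, target : (batch, time_steps, features) forecasts and ground truth.
    bound        : (batch, time_steps, features) error bounds, e.g. estimated
                   by a slowly updated copy of the model (hypothetical here).
    margin       : small slack subtracted from the bound.

    For each element, once the error falls below (bound - margin), the
    gradient direction is reversed, so the model stops fitting that element
    further -- a flooding-style trick applied per time step and feature.
    """
    err = (pred - target) ** 2                # elementwise training error
    b = (bound - margin).detach()             # bounds carry no gradient
    # |err - b| + b equals err when err > b, and 2b - err otherwise,
    # which flips the gradient sign once the error drops below the bound.
    return (torch.abs(err - b) + b).mean()

# Usage sketch (names are hypothetical): `model` is the forecaster being
# trained; `target_model` is a frozen/EMA copy used only to estimate bounds.
# pred = model(x)
# with torch.no_grad():
#     bound = (target_model(x) - y) ** 2
# loss = bounded_mse_loss(pred, y, bound)
# loss.backward()
```

The key design point this illustrates is that the bound is computed separately for every time step and feature, so easy-to-predict elements keep receiving a normal training signal while hard, unpredictable elements are prevented from dominating the loss.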