This is the official PyTorch implementation of the NeurIPS 2023 (poster) paper "OneNet: Enhancing Time Series Forecasting Models under Concept Drift by Online Ensembling".
This codebase is mainly based on FSNet.
Online updating of time series forecasting models aims to address the concept drift problem by efficiently updating forecasting models based on streaming data. Many algorithms have been designed for online time series forecasting; some exploit cross-variable dependency, while others assume independence among variables. Since each data assumption has its own pros and cons in online time series modeling, we propose the Online ensembling Network (OneNet). It dynamically updates and combines two models, one focusing on modeling dependency across the time dimension and the other on cross-variable dependency. Our method incorporates a reinforcement-learning-based approach into the traditional online convex programming framework, allowing the two models to be linearly combined with dynamically adjusted weights. OneNet addresses the main shortcoming of classical online learning methods, namely that they tend to be slow to adapt to concept drift. Empirical results show that OneNet reduces online forecasting error by more than 50% compared to the state-of-the-art (SOTA) method.
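For intuition, the sketch below shows the basic idea of online ensembling of two forecasters: both models are updated on each newly revealed sample, and their predictions are linearly combined with weights that adapt online. This is not the repository's OneNet implementation; the exponentiated-gradient weight update is a simplified stand-in for the OCP-plus-reinforcement-learning weighting described in the paper, and all names, models, and shapes are illustrative.

```python
# Minimal, illustrative sketch of online ensembling of two forecasters.
# NOT the repository's OneNet implementation: the weight update below is a
# plain exponentiated-gradient rule standing in for the paper's OCP + RL
# weighting, and all names here are made up for illustration.
import torch
import torch.nn as nn

seq_len, pred_len, n_vars = 60, 1, 7             # look-back 60, 1-step forecast (online setting)

# Two toy forecasters: one mixes information along time, one across variables.
time_model = nn.Linear(seq_len, pred_len)        # applied per variable (time dimension)
var_model = nn.Linear(n_vars, n_vars)            # applied per time step (variable dimension)

opt = torch.optim.SGD(
    list(time_model.parameters()) + list(var_model.parameters()), lr=1e-3)
log_w = torch.zeros(2)                           # ensemble logits -> softmax weights
eta = 1.0                                        # weight-update rate

def forecast(x):                                 # x: (batch, seq_len, n_vars)
    y_time = time_model(x.transpose(1, 2)).transpose(1, 2)   # (batch, pred_len, n_vars)
    y_var = var_model(x[:, -pred_len:, :])                   # (batch, pred_len, n_vars)
    w = torch.softmax(log_w, dim=0)
    return w[0] * y_time + w[1] * y_var, y_time, y_var

for step in range(100):                          # simulated data stream
    x = torch.randn(1, seq_len, n_vars)
    y = torch.randn(1, pred_len, n_vars)         # ground truth revealed after forecasting
    y_hat, y_time, y_var = forecast(x)

    # 1) update both forecasters on the newly revealed sample
    loss = nn.functional.mse_loss(y_hat, y)
    opt.zero_grad(); loss.backward(); opt.step()

    # 2) update ensemble weights from each expert's own loss
    with torch.no_grad():
        losses = torch.stack([nn.functional.mse_loss(y_time, y),
                              nn.functional.mse_loss(y_var, y)])
        log_w -= eta * losses                    # lower-loss expert gets more weight
```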
We follow the same data formatting as the Informer repo (https://github.com/zhouhaoyi/Informer2020), which also hosts the raw data.
Please put all raw data (csv) files in the ./data folder.
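For reference, with the default files from the Informer repo the folder would typically look like the listing below. The file names are assumed to follow the Informer conventions; confirm the exact names expected by this repo's data loader.

```
./data
├── ETTh1.csv
├── ETTh2.csv
├── ETTm1.csv
├── ETTm2.csv
├── ECL.csv
└── WTH.csv
```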
To replicate our results on the ETT, ECL, Traffic, and WTH datasets, run
sh run.sh
You can specify the method via the --method argument.
Dataset: Our implementation currently supports the following datasets: Electricity Transformer (ETT; including ETTh1, ETTh2, ETTm1, and ETTm2), ECL, Traffic, and WTH. You can specify the dataset via the --data argument.
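For example, a single online-forecasting run on ETTh1 might look like the command below. The entry point main.py and the placeholder <method_name> are assumptions for illustration; check run.sh for the exact script and method names used in our experiments. The remaining flags are described next.

```bash
python -u main.py --method <method_name> --data ETTh1 --test_bsz 1 --seq_len 60 --pred_len 1
```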
Other arguments: Additional useful arguments for experiments are:
- --test_bsz: batch size used for testing; must be set to 1 for online learning.
- --seq_len: length of the look-back window, set to 60 by default.
- --pred_len: length of the forecast window, set to 1 for online learning.

Backbones: Our implementation supports the following backbones in Table 1:
Ablations: Our ablation baselines for online learning and ensembling, reported in Table 4:
Algorithms: Our implementation supports the following training strategies in Tables 2 and 3:
This source code is released under the MIT license, which is included in this repository.
If you find this repo useful, please consider citing:
@misc{zhang2023onenet,
title={OneNet: Enhancing Time Series Forecasting Models under Concept Drift by Online Ensembling},
author={Yi-Fan Zhang and Qingsong Wen and Xue Wang and Weiqi Chen and Liang Sun and Zhang Zhang and Liang Wang and Rong Jin and Tieniu Tan},
year={2023},
eprint={2309.12659},
archivePrefix={arXiv},
primaryClass={cs.LG}
}