Autoformer on non-periodic time series

https://openreview.net/forum?id=J4gRj6d5Qm

 

Q1: For time series without clear periodicity, does Auto-Correlation still work?

In Table 3 of the main paper, we have shown that Auto-Correlation outperforms the self-attention family on the ETT dataset (which has clear periodicity), by replacing the Auto-Correlation in Autoformer with different self-attention modules. As per your suggestion, we repeat the experiment on the Exchange dataset (which has no clear periodicity). Here are the results:

| Exchange (without clear periodicity), input-96-predict-336 | MSE | MAE |
| --- | --- | --- |
| Auto-Correlation + Autoformer architecture (deep decomposition) | 0.488 | 0.510 |
| Full Self-Attention + Autoformer architecture (deep decomposition) | 0.632 | 0.584 |
| LogSparse Attention + Autoformer architecture (deep decomposition) | 0.569 | 0.592 |
| LSH Attention + Autoformer architecture (deep decomposition) | 0.553 | 0.549 |
| ProbSparse Attention + Autoformer architecture (deep decomposition) | 0.958 | 0.729 |

As shown in the table above, for series without clear periodicity, Auto-Correlation still outperforms the other self-attention mechanisms in both MSE and MAE. Therefore, Auto-Correlation is a robust deep learning module for series with general patterns.
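For context on what the table compares: Auto-Correlation discovers period-based dependencies from the series' autocorrelation (computed efficiently via FFT) and then aggregates the series at its strongest time delays. Below is a minimal single-head PyTorch sketch with queries = keys = values and no learned projections, for illustration only; the actual Autoformer module is multi-head with learned projections and per-head delay selection.

```python
import torch
import torch.nn.functional as F

def auto_correlation(x, top_k=3):
    """Minimal single-head sketch of Auto-Correlation.
    x: [batch, length, channels]; queries = keys = values = x."""
    _, L, _ = x.shape
    # Wiener-Khinchin: the autocorrelation R(tau) is the inverse FFT
    # of the power spectrum of the series.
    spec = torch.fft.rfft(x, dim=1)
    corr = torch.fft.irfft(spec * torch.conj(spec), n=L, dim=1)
    # Average the correlation over batch and channels and keep the
    # top-k time delays (lag 0 trivially dominates in this sketch).
    weights, delays = torch.topk(corr.mean(dim=(0, 2)), top_k)
    weights = F.softmax(weights, dim=0)
    # Time-delay aggregation: roll the series to each selected lag and
    # blend the rolled copies, weighted by their correlation scores.
    out = torch.zeros_like(x)
    for w, tau in zip(weights, delays.tolist()):
        out = out + w * torch.roll(x, shifts=-tau, dims=1)
    return out
```

Because the delays are ranked rather than assumed, the mechanism degrades gracefully on non-periodic data: it simply aggregates whatever lag structure is present, which is consistent with the Exchange results above.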

We have also provided some visualization results on the Exchange dataset (without clear periodicity) in Section 4.2 of the supplementary material, which show that Autoformer can make meaningful long-term forecasts for series without clear periodicity. Notably, both the Auto-Correlation mechanism and the deep decomposition architecture contribute substantially to Autoformer's predictive power on complex series.
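To make the "deep decomposition" label in the table above concrete: Autoformer progressively separates each intermediate series into a slowly-varying trend and a seasonal remainder using a moving average. Here is a minimal PyTorch sketch of that building block (assuming an odd kernel size and replication padding at the ends); the full model stacks this inside every encoder and decoder layer.

```python
import torch
import torch.nn as nn

class SeriesDecomp(nn.Module):
    """Moving-average series decomposition block (sketch)."""
    def __init__(self, kernel_size=25):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size, stride=1, padding=0)

    def forward(self, x):
        # x: [batch, length, channels]
        # Replicate the endpoints so the trend keeps the input length.
        pad = (self.kernel_size - 1) // 2
        front = x[:, :1, :].repeat(1, pad, 1)
        back = x[:, -1:, :].repeat(1, pad, 1)
        padded = torch.cat([front, x, back], dim=1)
        # Moving average over time extracts the trend component.
        trend = self.avg(padded.permute(0, 2, 1)).permute(0, 2, 1)
        seasonal = x - trend
        return seasonal, trend
```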

 

 

Q3: More baselines for comparison, including ARIMA, DeepGLO, and N-BEATS.

We have provided in Table 4 of the supplementary material the results of ARIMA, as well as a well-known deep learning model (DeepAR) and a strong statistical model (Prophet, open-sourced by Facebook). Here are further results comparing Autoformer with ARIMA, DeepGLO, and N-BEATS.

(1) Results of input-96-predict-O under the univariate setting, where O ∈ {96, 192, 336, 720} is the prediction length (a sketch of this evaluation protocol follows the two tables below).

| ETT (MSE / MAE) | Predict 96 | Predict 192 | Predict 336 | Predict 720 |
| --- | --- | --- | --- | --- |
| ARIMA | 0.568 / 0.572 | 0.804 / 0.720 | 1.438 / 1.010 | 3.291 / 1.569 |
| DeepGLO | 0.199 / 0.341 | 0.223 / 0.360 | 0.245 / 0.400 | 0.328 / 0.462 |
| N-BEATS | 0.257 / 0.389 | 0.298 / 0.424 | 0.320 / 0.445 | 0.363 / 0.480 |
| Autoformer | 0.065 / 0.189 | 0.110 / 0.258 | 0.145 / 0.295 | 0.182 / 0.335 |

| Exchange (MSE / MAE) | Predict 96 | Predict 192 | Predict 336 | Predict 720 |
| --- | --- | --- | --- | --- |
| ARIMA | 0.308 / 0.396 | 1.305 / 1.178 | 1.762 / 1.445 | 5.017 / 1.893 |
| DeepGLO | 0.850 / 0.786 | 1.825 / 1.185 | 2.210 / 1.330 | 5.818 / 2.232 |
| N-BEATS | 0.319 / 0.433 | 0.706 / 0.651 | 1.282 / 0.879 | 2.757 / 1.341 |
| Autoformer | 0.126 / 0.268 | 0.530 / 0.565 | 0.586 / 0.572 | 1.838 / 1.201 |
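As promised above, here is a generic sketch of the input-96-predict-O protocol: the model reads 96 past steps and forecasts the next O steps, with MSE/MAE averaged over windows of the test set. The stride-1 sliding windows and the assumption of an already-standardized series are illustrative defaults here, not the paper's exact benchmark configuration.

```python
import numpy as np

def evaluate(forecast_fn, series, input_len=96, pred_len=336, stride=1):
    """Average MSE/MAE of `forecast_fn` over sliding windows.
    `forecast_fn(history, horizon)` must return a [horizon, ...] array."""
    mse, mae, n = 0.0, 0.0, 0
    last = len(series) - input_len - pred_len
    for start in range(0, last + 1, stride):
        x = series[start : start + input_len]
        y = series[start + input_len : start + input_len + pred_len]
        err = forecast_fn(x, pred_len) - y
        mse += float(np.mean(err ** 2))
        mae += float(np.mean(np.abs(err)))
        n += 1
    return mse / n, mae / n
```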

(2) Results of input-96-predict-O under the multivariate setting. For ARIMA and N-BEATS (univariate models), we predict the multivariate series dimension by dimension (see the sketch after the two tables below).

| ETT (MSE / MAE) | Predict 96 | Predict 192 | Predict 336 | Predict 720 |
| --- | --- | --- | --- | --- |
| ARIMA | 0.267 / 0.382 | 2.414 / 0.588 | 10.083 / 0.896 | 15.338 / 1.183 |
| DeepGLO | 0.288 / 0.395 | 0.510 / 0.551 | 0.872 / 0.734 | 2.173 / 1.208 |
| N-BEATS | 0.313 / 0.395 | 0.392 / 0.440 | 0.464 / 0.473 | 0.571 / 0.521 |
| Autoformer | 0.194 / 0.284 | 0.261 / 0.323 | 0.351 / 0.384 | 0.491 / 0.470 |

| Exchange (MSE / MAE) | Predict 96 | Predict 192 | Predict 336 | Predict 720 |
| --- | --- | --- | --- | --- |
| ARIMA | 0.327 / 0.417 | 0.656 / 0.568 | 0.970 / 0.572 | 4.808 / 1.182 |
| DeepGLO | 0.928 / 0.751 | 1.142 / 0.861 | 1.512 / 1.013 | 1.542 / 1.097 |
| N-BEATS | 0.316 / 0.409 | 0.328 / 0.444 | 1.203 / 0.819 | 1.672 / 1.013 |
| Autoformer | 0.134 / 0.270 | 0.272 / 0.374 | 0.488 / 0.510 | 1.367 / 0.901 |
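As noted before the tables, the univariate baselines are applied dimension by dimension in the multivariate setting. A minimal sketch of that procedure using statsmodels' ARIMA follows; the `order=(1, 1, 1)` is an arbitrary illustrative choice, not the configuration behind the reported numbers.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def forecast_dimension_by_dimension(history, horizon, order=(1, 1, 1)):
    """Fit an independent ARIMA model to each variable of a multivariate
    series (history: [length, n_dims]) and forecast `horizon` steps."""
    forecasts = []
    for d in range(history.shape[1]):
        fitted = ARIMA(history[:, d], order=order).fit()
        forecasts.append(fitted.forecast(steps=horizon))
    return np.stack(forecasts, axis=1)  # [horizon, n_dims]
```

Treating each dimension independently ignores cross-variable dependencies, which is one reason the univariate baselines fall behind in the multivariate tables above.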

We will make Table 1 of the main paper more complete by adding the above results for all datasets and forecasting horizons.

 


