CROSSFORMER: TRANSFORMER UTILIZING CROSS-DIMENSION DEPENDENCY FOR MULTIVARIATE TIME SERIES FORECASTING
Yunhao Zhang
China | ICLR 2023


Published as a conference paper at ICLR 2023

https://openreview.net/pdf?id=vSVLM2j9eie

 

■ Researchers

Yunhao Zhang

Shanghai Jiao Tong University and Shanghai AI Lab

 

■ Abstract

Recently, many deep models have been proposed for multivariate time series (MTS) forecasting. In particular, Transformer-based models have shown great potential because they can capture long-term dependency. However, existing Transformer-based models mainly focus on modeling the temporal dependency (cross-time dependency) yet often omit the dependency among different variables (cross-dimension dependency), which is critical for MTS forecasting. To fill the gap, we propose Crossformer, a Transformer-based model utilizing cross-dimension dependency for MTS forecasting. In Crossformer, the input MTS is embedded into a 2D vector array through the Dimension-Segment-Wise (DSW) embedding to preserve time and dimension information. Then the Two-Stage Attention (TSA) layer is proposed to efficiently capture the cross-time and cross-dimension dependency. Utilizing the DSW embedding and TSA layer, Crossformer establishes a Hierarchical Encoder-Decoder (HED) to use the information at different scales for the final forecasting. Extensive experimental results on six real-world datasets show the effectiveness of Crossformer against previous state-of-the-art methods.
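The DSW embedding described above can be illustrated with a minimal numpy sketch. This is an assumption-laden illustration, not the paper's implementation: the function name `dsw_embedding`, the random projection weights, and the exact shapes are hypothetical, but the core idea follows the abstract — each dimension's series is split into segments, and each segment is linearly projected to a vector, yielding a 2D array of embeddings that preserves both the time and the dimension axes.

```python
import numpy as np

def dsw_embedding(x, seg_len, d_model, rng=None):
    """Sketch of Dimension-Segment-Wise embedding (hypothetical helper).

    x: array of shape (T, D) -- an MTS with T time steps and D dimensions.
    Each dimension's series is cut into T // seg_len segments of length
    seg_len; every segment is linearly projected to a d_model-dim vector.
    Returns an array of shape (D, T // seg_len, d_model): a 2D grid of
    segment embeddings indexed by dimension and by time segment.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    T, D = x.shape
    assert T % seg_len == 0, "pad the series so T is a multiple of seg_len"
    # Shared linear projection (learned in the real model; random here).
    W = rng.standard_normal((seg_len, d_model)) / np.sqrt(seg_len)
    b = np.zeros(d_model)
    # (T, D) -> (D, T) -> (D, n_seg, seg_len): group each dimension's
    # series into consecutive segments.
    segs = x.T.reshape(D, T // seg_len, seg_len)
    return segs @ W + b  # (D, n_seg, d_model)

# Example: 24 time steps, 7 variables, segments of length 6.
x = np.random.default_rng(1).standard_normal((24, 7))
emb = dsw_embedding(x, seg_len=6, d_model=16)
print(emb.shape)  # (7, 4, 16)
```

On this 2D grid, the TSA layer would then apply attention twice: first along the segment axis within each dimension (cross-time), then along the dimension axis within each segment position (cross-dimension), which is what lets Crossformer model both dependencies without flattening them into one sequence.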

 

 
