
salesforce/ETSformer

PyTorch code for ETSformer: Exponential Smoothing Transformers for Time-series Forecasting



Figure 1. Overall ETSformer Architecture.

Official PyTorch code repository for the ETSformer paper. Check out our blog post!

  • ETSformer is a novel time-series Transformer architecture that exploits the principle of exponential smoothing to improve Transformers for time-series forecasting.
  • ETSformer is inspired by classical exponential smoothing methods in time-series forecasting, leveraging novel exponential smoothing attention (ESA) and frequency attention (FA) mechanisms to replace the self-attention in vanilla Transformers, improving both accuracy and efficiency. A simplified sketch of both mechanisms follows below.
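
As a rough illustration of these two ideas, the sketch below weights past time steps with exponentially decaying attention (the intuition behind ESA) and keeps only the dominant Fourier components of a sequence (the intuition behind FA). This is a simplified, hypothetical rendering, not the code in this repository; the function names, the smoothing parameter alpha, and the top-k cutoff are all assumptions.

```python
import torch


def exponential_smoothing_attention(values: torch.Tensor, alpha: float = 0.3) -> torch.Tensor:
    # Weight past time steps with exponentially decaying weights
    # alpha * (1 - alpha) ** lag, applied causally along the time axis.
    T = values.size(1)                                # values: (batch, time, dim)
    t = torch.arange(T)
    lag = t.view(T, 1) - t.view(1, T)                 # lag[i, j] = i - j
    weights = alpha * (1 - alpha) ** lag.clamp(min=0).float()
    weights = weights * (lag >= 0)                    # causal mask: only attend to the past
    # Note: the full ESA formulation also carries an initial-state term; omitted here.
    return weights @ values                           # (T, T) x (B, T, D) -> (B, T, D)


def frequency_attention(x: torch.Tensor, k: int = 3) -> torch.Tensor:
    # Keep only the k largest-amplitude Fourier components along the time axis,
    # discarding the rest, then transform back to the time domain.
    freq = torch.fft.rfft(x, dim=1)                   # (B, T // 2 + 1, D), complex
    amplitude = freq.abs().mean(dim=(0, 2))           # mean amplitude per frequency bin
    mask = torch.zeros_like(amplitude, dtype=torch.bool)
    mask[torch.topk(amplitude, k).indices] = True
    return torch.fft.irfft(freq * mask.view(1, -1, 1), n=x.size(1), dim=1)
```

For example, exponential_smoothing_attention(torch.randn(2, 96, 8)) returns a tensor of the same shape, with each time step a decaying average of its past.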

Requirements

  1. Install Python 3.8 and the required dependencies.
  2. The required dependencies can be installed by running: pip install -r requirements.txt

Data

  • Pre-processed datasets can be downloaded from the following links, Tsinghua Cloud or Google Drive, as obtained from Autoformer's GitHub repository.
  • Place the downloaded datasets into the dataset/ folder, e.g. dataset/ETT-small/ETTm2.csv. A quick sanity check follows below.
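
As a quick, hypothetical sanity check (not part of the repository) that the data landed where the scripts expect it, assuming pandas is available:

```python
import pandas as pd

# Load the example dataset placed at the path given above.
df = pd.read_csv("dataset/ETT-small/ETTm2.csv")
print(df.shape)  # should report the number of rows and columns in the series
```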

Usage

  1. Install the required dependencies.
  2. Download the data as described above, and place it in the dataset/ folder.
  3. Train the model. We provide experiment scripts for all benchmarks under the ./scripts folder, e.g. ./scripts/ETTm2.sh. You might have to change permissions on the script files by running chmod u+x scripts/*.
  4. A script for grid search is also provided, and can be run by ./grid_search.sh.

Acknowledgements

The implementation of ETSformer relies on resources from the following codebases and repositories; we thank the original authors for open-sourcing their work.

Citation

If you find this code useful for your research, please consider citing:

@article{woo2022etsformer,
  title={ETSformer: Exponential Smoothing Transformers for Time-series Forecasting},
  author={Gerald Woo and Chenghao Liu and Doyen Sahoo and Akshat Kumar and Steven C. H. Hoi},
  year={2022},
  url={https://arxiv.org/abs/2202.01381},
}
