PyTorch code for ETSformer: Exponential Smoothing Transformers for Time-series Forecasting

Figure 1. Overall ETSformer Architecture.
Official PyTorch code repository for the ETSformer paper. Check out our blog post!
- ETSformer is a novel time-series Transformer architecture which exploits the principle of exponential smoothing to improve Transformers for time-series forecasting.
- ETSformer is inspired by classical exponential smoothing methods in time-series forecasting, leveraging the novel exponential smoothing attention (ESA) and frequency attention (FA) to replace the self-attention mechanism in vanilla Transformers, thus improving both accuracy and efficiency.
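To illustrate the exponential smoothing principle behind ESA, here is a minimal NumPy sketch (not the repository's implementation, which operates on learned latent representations inside the Transformer): attention weights over past time steps decay exponentially, so the most recent steps dominate. The function names and the single-head, single-series setting are illustrative assumptions.

```python
import numpy as np

def es_attention_weights(t, alpha=0.5):
    """Illustrative sketch: exponentially decaying attention weights over
    t past steps. A step that lies k steps in the past gets a weight
    proportional to alpha * (1 - alpha)**k, as in classical exponential
    smoothing; weights are normalised to sum to 1."""
    lags = np.arange(t - 1, -1, -1)          # lag of each step: t-1, ..., 1, 0
    w = alpha * (1 - alpha) ** lags
    return w / w.sum()

def exponential_smoothing_attention(values, alpha=0.5):
    """values: (t, d) array of per-step representations; returns the
    exponentially smoothed representation of the latest step."""
    w = es_attention_weights(values.shape[0], alpha)
    return w @ values
```

Unlike vanilla self-attention, these weights are not computed from query-key dot products; they depend only on the lag, which is what gives ESA its efficiency and its inductive bias toward recent observations.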
- Install Python 3.8, and the required dependencies.
- Required dependencies can be installed by:

  `pip install -r requirements.txt`
- Pre-processed datasets can be downloaded from the following links, Tsinghua Cloud or Google Drive, as obtained from Autoformer's GitHub repository.
- Place the downloaded datasets into the `dataset/` folder, e.g. `dataset/ETT-small/ETTm2.csv`.
- Install the required dependencies.
- Download the data as above, and place it in the `dataset/` folder.
- Train the model. We provide the experiment scripts for all benchmarks under the `./scripts` folder, e.g. `./scripts/ETTm2.sh`. You might have to change permissions on the script files by running `chmod u+x scripts/*`.
- The script for grid search is also provided, and can be run by `./grid_search.sh`.
The implementation of ETSformer relies on resources from the following codebases and repositories; we thank the original authors for open-sourcing their work.
Please consider citing if you find this code useful to your research.
@article{woo2022etsformer,
  title={ETSformer: Exponential Smoothing Transformers for Time-series Forecasting},
  author={Gerald Woo and Chenghao Liu and Doyen Sahoo and Akshat Kumar and Steven C. H. Hoi},
  year={2022},
  url={https://arxiv.org/abs/2202.01381},
}