Official implementation of Earthformer

amazon-science/earth-forecasting-transformer

By Zhihan Gao, Xingjian Shi, Hao Wang, Yi Zhu, Yuyang Wang, Mu Li, Dit-Yan Yeung.

This repo is the official implementation of "Earthformer: Exploring Space-Time Transformers for Earth System Forecasting", which appeared in NeurIPS 2022.

Check our poster.

Tutorials

Introduction

Conventionally, Earth system (e.g., weather and climate) forecasting relies on numerical simulation with complex physical models and is hence both computationally expensive and demanding of domain expertise. With the explosive growth of spatiotemporal Earth observation data in the past decade, data-driven models that apply Deep Learning (DL) are demonstrating impressive potential for various Earth system forecasting tasks. The Transformer, despite its broad success in other domains, has seen limited adoption in this area. In this paper, we propose Earthformer, a space-time Transformer for Earth system forecasting. Earthformer is based on a generic, flexible, and efficient space-time attention block named Cuboid Attention. The idea is to decompose the data into cuboids and apply cuboid-level self-attention in parallel. These cuboids are further connected with a collection of global vectors.

Earthformer achieves strong results on synthetic datasets such as MovingMNIST and N-body MNIST, and also outperforms non-Transformer models (such as ConvLSTM and CNN-U-Net) on SEVIR (precipitation nowcasting) and ICAR-ENSO2021 (El Niño/Southern Oscillation forecasting).


Cuboid Attention Illustration

(Figure: illustration of cuboid attention.)
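To make the cuboid idea above concrete, here is a minimal, illustrative sketch in PyTorch (it is not the implementation in this repository: the cuboid size is arbitrary, a plain nn.MultiheadAttention stands in for the attention block, and the global vectors are omitted). A space-time tensor is split into non-overlapping cuboids and self-attention runs within each cuboid in parallel.

import torch
import torch.nn as nn

def cuboid_self_attention(x, attn, cuboid_size=(2, 4, 4)):
    # x: (B, T, H, W, C); attn: an nn.MultiheadAttention built with batch_first=True.
    B, T, H, W, C = x.shape
    bt, bh, bw = cuboid_size
    assert T % bt == 0 and H % bh == 0 and W % bw == 0
    # Split into non-overlapping cuboids of size (bt, bh, bw) and fold them into the batch dim.
    x = x.reshape(B, T // bt, bt, H // bh, bh, W // bw, bw, C)
    x = x.permute(0, 1, 3, 5, 2, 4, 6, 7).reshape(-1, bt * bh * bw, C)
    out, _ = attn(x, x, x)  # self-attention inside each cuboid, all cuboids in parallel
    # Undo the decomposition back to (B, T, H, W, C).
    out = out.reshape(B, T // bt, H // bh, W // bw, bt, bh, bw, C)
    out = out.permute(0, 1, 4, 2, 5, 3, 6, 7).reshape(B, T, H, W, C)
    return out

x = torch.randn(2, 8, 16, 16, 32)
attn = nn.MultiheadAttention(embed_dim=32, num_heads=4, batch_first=True)
print(cuboid_self_attention(x, attn).shape)  # torch.Size([2, 8, 16, 16, 32])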

Installation

We recommend managing the environment through Anaconda.

First, find out where CUDA is installed on your machine. It is usually under /usr/local/cuda or /opt/cuda.

Next, check which version of CUDA you have installed on your machine:

nvcc --version

Then, create a new conda environment:

conda create -n earthformer python=3.9
conda activate earthformer

Lastly, install dependencies. For example, if you have CUDA 11.6 installed under /usr/local/cuda, run:

python3 -m pip install torch==1.12.1+cu116 torchvision==0.13.1+cu116 -f https://download.pytorch.org/whl/torch_stable.html
python3 -m pip install pytorch_lightning==1.6.4
python3 -m pip install xarray netcdf4 opencv-python earthnet==0.3.9
cd ROOT_DIR/earth-forecasting-transformer
python3 -m pip install -U -e . --no-build-isolation

# Install Apex
CUDA_HOME=/usr/local/cuda python3 -m pip install -v --no-cache-dir --global-option="--cpp_ext" --global-option="--cuda_ext" pytorch-extension git+https://github.com/NVIDIA/apex.git

If you have CUDA 11.7 installed under /opt/cuda, run:

python3 -m pip install torch==1.13.1+cu117 torchvision==0.14.1+cu117 -f https://download.pytorch.org/whl/torch_stable.html
python3 -m pip install pytorch_lightning==1.6.4
python3 -m pip install xarray netcdf4 opencv-python earthnet==0.3.9
cd ROOT_DIR/earth-forecasting-transformer
python3 -m pip install -U -e . --no-build-isolation

# Install Apex
CUDA_HOME=/opt/cuda python3 -m pip install -v --no-cache-dir --global-option="--cpp_ext" --global-option="--cuda_ext" pytorch-extension git+https://github.com/NVIDIA/apex.git
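To sanity-check the installation afterwards, a minimal sketch (the version strings in the comments correspond to the CUDA 11.6 / 11.7 setups above):

import torch
import apex  # only importable if the Apex build above succeeded

print(torch.__version__)           # e.g. 1.12.1+cu116 or 1.13.1+cu117
print(torch.version.cuda)          # should match the CUDA toolkit used above
print(torch.cuda.is_available())   # True if a GPU and a matching driver are visible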

Dataset

MovingMNIST

We follow Unsupervised Learning of Video Representations using LSTMs (ICML 2015) and use MovingMNIST, which contains 10,000 sequences, each of length 20, showing 2 digits moving in a $64\times 64$ frame.

Our MovingMNIST DataModule automatically downloads it to datasets/moving_mnist.
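If you want to inspect the raw data directly, a minimal sketch (the file name mnist_test_seq.npy and the (frame, sequence, height, width) layout are assumptions based on the standard MovingMNIST release):

import numpy as np

# Assumed: the standard MovingMNIST array with shape (20, 10000, 64, 64).
data = np.load("datasets/moving_mnist/mnist_test_seq.npy")
seq = data[:, 0]                   # one sequence of 20 frames, shape (20, 64, 64)
print(data.shape, seq.shape)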

N-body MNIST

The underlying dynamics of the N-body MNIST dataset are governed by Newton's law of universal gravitation:

$\frac{d^2\boldsymbol{x}_{i}}{dt^2} = - \sum_{j\neq i}\frac{G m_j (\boldsymbol{x}_{i}-\boldsymbol{x}_{j})}{(|\boldsymbol{x}_i-\boldsymbol{x}_j|+d_{\text{soft}})^r}$

where $\boldsymbol{x}_{i}$ is the spatial coordinates of the $i$-th digit, $G$ is the gravitational constant, $m_j$ is the mass of the $j$-th digit, $r$ is a constant representing the power scale in the gravitational law, and $d_{\text{soft}}$ is a small softening distance that ensures numerical stability.
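For intuition, here is a minimal sketch of one explicit Euler step of this update rule (the constants and the integration scheme are illustrative; this is not the generation script shipped under scripts/datasets/nbody):

import numpy as np

def nbody_step(x, v, m, G=1.0, r=2.0, d_soft=1e-2, dt=0.1):
    # x: (N, 2) digit positions, v: (N, 2) velocities, m: (N,) masses.
    acc = np.zeros_like(x)
    for i in range(len(x)):
        for j in range(len(x)):
            if i == j:
                continue
            diff = x[i] - x[j]
            dist = np.linalg.norm(diff)
            # d^2 x_i / dt^2 = -sum_{j != i} G * m_j * (x_i - x_j) / (|x_i - x_j| + d_soft)^r
            acc[i] -= G * m[j] * diff / (dist + d_soft) ** r
    v = v + dt * acc
    x = x + dt * v
    return x, v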

The N-body MNIST dataset we used in the paper can be downloaded from https://earthformer.s3.amazonaws.com/nbody/nbody_paper.zip.

You can also use the following script to download and extract the data:

cd ROOT_DIR/earth-forecasting-transformer
python ./scripts/datasets/nbody/download_nbody_paper.py

Alternatively, run the following commands to generate the N-body MNIST dataset yourself:

cd ROOT_DIR/earth-forecasting-transformer
python ./scripts/datasets/nbody/generate_nbody_dataset.py --cfg ./scripts/datasets/nbody/cfg.yaml

SEVIR

The Storm EVent ImageRy (SEVIR) dataset is a spatiotemporally aligned dataset containing over 10,000 weather events. We adopt the NEXRAD Vertically Integrated Liquid (VIL) mosaics in SEVIR for benchmarking precipitation nowcasting, i.e., predicting the future VIL up to 60 minutes ahead given 65 minutes of context VIL. The resolution is thus $13\times 384\times 384 \rightarrow 12\times 384\times 384$.

To download SEVIR dataset from AWS S3, run:

cd ROOT_DIR/earth-forecasting-transformer
python ./scripts/datasets/sevir/download_sevir.py --dataset sevir

(Figure: a visualization example of a SEVIR VIL sequence.)
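As a rough sketch of how the $13 \rightarrow 12$ split above maps onto the raw data, assuming the standard SEVIR HDF5 layout in which a VIL file stores a vil array of shape (num_events, 384, 384, 49) at 5-minute intervals (the file path below is illustrative):

import h5py
import numpy as np

# Illustrative path; actual file names depend on the downloaded SEVIR directory layout.
with h5py.File("datasets/sevir/data/vil/2019/SEVIR_VIL_STORMEVENTS_2019_0101_0630.h5", "r") as f:
    vil = f["vil"][0]                          # one event: (384, 384, 49), 5-minute frames

frames = np.moveaxis(vil, -1, 0)               # (49, 384, 384)
context, target = frames[:13], frames[13:25]   # 65 min of context -> 60 min to predict
print(context.shape, target.shape)             # (13, 384, 384) (12, 384, 384)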

ICAR-ENSO

ICAR-ENSO consists of historical climate observation and simulation data provided by the Institute for Climate and Application Research (ICAR). We forecast the SST anomalies up to 14 steps ahead (2 steps more than one year, for calculating the three-month moving average), given a context of 12 steps (one year) of SST anomaly observations.

To download the dataset, you need to follow the instructions on the official website. You can download a zip file named enso_round1_train_20210201.zip. Put it under ./datasets/ and extract the zip file with the following command:

unzip datasets/enso_round1_train_20210201.zip -d datasets/icar_enso_2021
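A minimal sketch of slicing a 12-step context / 14-step target window from the extracted data (the file name SODA_train.nc, the variable name sst, and the dimension name month below are assumptions about the challenge data layout; adjust them to whatever you actually find after extraction):

import xarray as xr

ds = xr.open_dataset("datasets/icar_enso_2021/enso_round1_train_20210201/SODA_train.nc")
sst = ds["sst"]                             # SST anomalies with a leading time-like axis

context = sst.isel(month=slice(0, 12))      # 12 steps (one year) of context
target = sst.isel(month=slice(12, 26))      # 14 steps to forecast
print(dict(context.sizes), dict(target.sizes))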

EarthNet2021

You may follow the official instructions for downloading the EarthNet2021 dataset. We recommend downloading it via the earthnet toolkit:

import earthnet as en
en.download(dataset="earthnet2021", splits="all", save_directory="./datasets/earthnet2021")

Alternatively, you may download the EarthNet2021x dataset, which is the same as the EarthNet2021 dataset except for the file format (.npz for EarthNet2021 and .nc for EarthNet2021x).

import earthnet as en
en.download(dataset="earthnet2021x", splits="all", save_directory="./datasets/earthnet2021x")

It requires 455 GB of disk space in total.
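A minimal sketch of the practical difference between the two formats (the sample paths below are placeholders, not actual file names in the dataset):

import numpy as np
import xarray as xr

# EarthNet2021: each sample is a .npz archive of numpy arrays.
sample_npz = np.load("datasets/earthnet2021/train/some_tile/some_sample.npz")
print(sample_npz.files)

# EarthNet2021x: each sample is a netCDF (.nc) file, conveniently opened with xarray.
sample_nc = xr.open_dataset("datasets/earthnet2021x/train/some_tile/some_sample.nc")
print(sample_nc)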

Earthformer Training

Find detailed instructions in the corresponding training script folder.

Training Script and Pretrained Models

Find detailed instructions on how to train the models, or how to run inference with our pretrained models, in the corresponding script folder.

| Dataset | Script Folder | Pretrained Weights | Config |
| --- | --- | --- | --- |
| SEVIR | scripts | link | config |
| ICAR-ENSO | scripts | link | config |
| EarthNet2021 | scripts | link | config |
| N-body MNIST | scripts | - | - |

Citing Earthformer

@inproceedings{gao2022earthformer,
  title={Earthformer: Exploring Space-Time Transformers for Earth System Forecasting},
  author={Gao, Zhihan and Shi, Xingjian and Wang, Hao and Zhu, Yi and Wang, Yuyang and Li, Mu and Yeung, Dit-Yan},
  booktitle={NeurIPS},
  year={2022}
}

Security

See CONTRIBUTING for more information.

Credits

Third-party libraries:

License

This project is licensed under the Apache-2.0 License.
