
Time-series forecasting with 1D Conv model, RNN (LSTM) model and Transformer model. Comparison of long-term and short-term forecasts using synthetic timeseries. Sequence-to-sequence formulation.


rsyamil/timeseries-rnn


This is an example of how to use a 1D convolutional neural network (1D-CNN) and a recurrent neural network (RNN) with a long short-term memory (LSTM) cell for one-step and multi-step timeseries prediction/forecasting. To run:

python3 <demo-cnn.py|demo-rnn.py>

The dataset we will use is a simple hyperbolic curve (timeseries) with added Gaussian noise. Technically, such a timeseries can be modeled with just three parameters in a hyperbolic function, but we use a simple curve purely as a demonstration. We split the timeseries into input-output windows to frame the task as a supervised machine learning problem: we want to predict a time window y_w+1 of length n_seq using the past information window y_w of length n_lag.
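As a sketch, the windowing can be done as below. The hyperbolic-curve parameters and the window lengths (n_lag = 6, n_seq = 6) are illustrative assumptions, not necessarily the repo's exact values.

```python
import numpy as np

def make_windows(series, n_lag, n_seq):
    """Split a 1D series into (y_w, y_w+1) input-output window pairs."""
    X, Y = [], []
    for i in range(len(series) - n_lag - n_seq + 1):
        X.append(series[i : i + n_lag])                   # past window y_w
        Y.append(series[i + n_lag : i + n_lag + n_seq])   # target window y_w+1
    return np.array(X), np.array(Y)

# Synthetic hyperbolic decline curve with added Gaussian noise (assumed values).
rng = np.random.default_rng(0)
t = np.arange(200)
y = 1.0 / (1.0 + 0.05 * t) + rng.normal(0.0, 0.01, size=t.shape)

X, Y = make_windows(y, n_lag=6, n_seq=6)
print(X.shape, Y.shape)  # (189, 6) (189, 6)
```

Each row of `X` paired with the same row of `Y` is one training example; a sliding offset of one step produces `len(y) - n_lag - n_seq + 1` windows.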

(figure: input-output windows of the timeseries)

The windows (i.e. the rows in the table above) now form our dataset, which we split into a training set and a testing set. We want to learn f, a time-invariant predictive model that relates y_w to y_w+1. In this example, we compare a 1D-CNN and an RNN as f.

1D-CNN forecast model

The 1D-CNN model has one-dimensional convolution filters that stride over the timeseries to extract temporal features. A couple of layers are stacked to capture the nonlinearities in the data, and the simple 1D-CNN model has only 942 parameters.
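A minimal Keras sketch of such a model is below. The filter counts, kernel size, and window lengths are assumptions for illustration, not the exact repo architecture (so the parameter count will differ from 942).

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_lag, n_seq = 6, 6  # assumed window lengths

# Two stacked Conv1D layers stride over the (n_lag, 1) window to extract
# temporal features; a Dense head maps them to the n_seq-point forecast.
model = keras.Sequential([
    keras.Input(shape=(n_lag, 1)),
    layers.Conv1D(8, kernel_size=3, activation="relu"),
    layers.Conv1D(16, kernel_size=3, activation="relu"),
    layers.Flatten(),
    layers.Dense(n_seq),
])
model.compile(optimizer="adam", loss="mse")

out = model.predict(np.zeros((2, n_lag, 1), dtype="float32"), verbose=0)
print(out.shape)  # (2, 6)
```

The model would be trained with `model.fit(X[..., None], Y)` on the windowed dataset from the previous step.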

(figure: 1D-CNN architecture)

The figure below shows the original timeseries as light-gray scatter points. The training and testing data points (i.e. y_w+1 only) are shown as red and blue scatter points respectively. The red and blue lines are the forecasts from the 1D-CNN model. The green line represents the multi-step prediction, where previous forecasts are fed back into the 1D-CNN model recursively.
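The recursive multi-step scheme can be sketched as follows. Here `model` stands for any trained window-to-window predictor with a Keras-style `predict` method; the helper name and the (1, n_lag, 1) input shape are assumptions.

```python
import numpy as np

def recursive_forecast(model, history, n_lag, n_steps):
    """Roll the model forward by feeding its own forecasts back in."""
    window = list(history[-n_lag:])   # seed with the last observed window
    out = []
    while len(out) < n_steps:
        x = np.asarray(window[-n_lag:], dtype="float32").reshape(1, n_lag, 1)
        yhat = model.predict(x, verbose=0).ravel()  # forecast next n_seq points
        out.extend(yhat)
        window.extend(yhat)           # previous forecasts become future inputs
    return np.array(out[:n_steps])
```

Because each step consumes the model's own output, errors compound as the horizon grows, which is what the long-horizon comparison later in this README probes.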

(figure: 1D-CNN forecasts)

RNN (LSTM) forecast model

For the RNN model, we will use an LSTM cell to extract the temporal features, followed by a Dense layer that reshapes the LSTM output tensor to the desired output length n_seq.

(figure: RNN (LSTM) architecture)

The RNN predictive model has only 546 parameters, of which 480 belong to the single LSTM cell, as shown below.

(figure: parameter counts of the RNN model)
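A Keras sketch that reproduces the stated parameter counts, assuming 10 LSTM units on a univariate input and n_seq = 6: the LSTM contributes 4 × ((1 + 10) × 10 + 10) = 480 parameters and the Dense head 10 × 6 + 6 = 66, for 546 in total.

```python
from tensorflow import keras
from tensorflow.keras import layers

n_lag, n_seq = 6, 6  # assumed window lengths

model = keras.Sequential([
    keras.Input(shape=(n_lag, 1)),
    layers.LSTM(10),      # 480 parameters: 4 * ((1 + 10) * 10 + 10)
    layers.Dense(n_seq),  # 66 parameters: 10 * 6 + 6
])
model.compile(optimizer="adam", loss="mse")
print(model.count_params())  # 546
```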

Note that the single LSTM cell strides the input y_w of length n_lag one point at a time to produce an output of length 10 (in this example). If return_sequences=True, the output of each stride is returned: instead of a single output of length 10, the output takes a shape of (6, 10), i.e. one output of length 10 for each of the 6 strides across the entire length n_lag. This will be important later for other applications.
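The effect of return_sequences can be checked directly; the shapes follow the 6-stride, 10-unit example above.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

x = np.zeros((1, 6, 1), dtype="float32")

# Default (return_sequences=False): only the final hidden state is returned.
last = keras.Sequential([keras.Input(shape=(6, 1)), layers.LSTM(10)])
print(last.predict(x, verbose=0).shape)  # (1, 10)

# return_sequences=True: one length-10 output at each of the 6 strides.
seq = keras.Sequential([keras.Input(shape=(6, 1)),
                        layers.LSTM(10, return_sequences=True)])
print(seq.predict(x, verbose=0).shape)  # (1, 6, 10)
```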

(figure: RNN (LSTM) forecasts)

The forecasts are shown above; the legend is the same as for the 1D-CNN plot in the previous section.

(figure: comparison of 1D-CNN and RNN multi-step forecasts)

In the plots above, we compare the multi-step predictions from the 1D-CNN and RNN models. The single-window forecasts (i.e. using the observed y_w to predict y_w+1) for the training and testing sets are similar for the two models. The RNN model, however, outperforms the 1D-CNN model for multi-step recursive forecasts.
