State-of-the-art Deep Learning library for Time Series and Sequences.

`tsai` is an open-source deep learning package built on top of Pytorch & fastai, focused on state-of-the-art techniques for time series tasks like classification, regression, forecasting, imputation…

`tsai` is currently under active development by timeseriesAI.

During the last few releases, here are some of the most significant additions to `tsai`:
- New models: PatchTST (Accepted by ICLR 2023), RNN with Attention (RNNAttention, LSTMAttention, GRUAttention), TabFusionTransformer, …
- New datasets: we have increased the number of datasets you can download using `tsai`:
  - 128 univariate classification datasets
  - 30 multivariate classification datasets
  - 15 regression datasets
  - 62 forecasting datasets
  - 9 long term forecasting datasets
- New tutorials: PatchTST. Based on some of your requests, we are planning to release additional tutorials on data preparation and forecasting.
- New functionality: sklearn-type pipeline transforms, walk-forward cross validation, reduced RAM requirements, and a lot of new functionality to perform more accurate time series forecasts.
- Pytorch 2.0 support.
You can install the latest stable version from pip using:

```
pip install tsai
```
If you plan to develop tsai yourself, or want to be on the cutting edge, you can use an editable install. First install PyTorch, and then:

```
git clone https://github.com/timeseriesAI/tsai
pip install -e "tsai[dev]"
```
Note: starting with tsai 0.3.0, tsai will only install hard dependencies. Other soft dependencies (which are only required for selected tasks) will not be installed by default (this is the recommended approach; if you require any of the dependencies that are not installed, tsai will ask you to install them when necessary). If you still want to install tsai with all its dependencies, you can do it by running:

```
pip install tsai[extras]
```
You can also install tsai using conda (note that if you replace conda with mamba the install process will be much faster and more reliable):

```
conda install -c timeseriesai tsai
```
Here’s the link to the documentation.
Here’s a list with some of the state-of-the-art models available in `tsai`:
- LSTM (Hochreiter, 1997) (paper)
- GRU (Cho, 2014) (paper)
- MLP - Multilayer Perceptron (Wang, 2016) (paper)
- FCN - Fully Convolutional Network (Wang, 2016) (paper)
- ResNet - Residual Network (Wang, 2016) (paper)
- LSTM-FCN (Karim, 2017) (paper)
- GRU-FCN (Elsayed, 2018) (paper)
- mWDN - Multilevel wavelet decomposition network (Wang, 2018) (paper)
- TCN - Temporal Convolutional Network (Bai, 2018) (paper)
- MLSTM-FCN - Multivariate LSTM-FCN (Karim, 2019) (paper)
- InceptionTime (Fawaz, 2019) (paper)
- Rocket (Dempster, 2019) (paper)
- XceptionTime (Rahimian, 2019) (paper)
- ResCNN - 1D-ResCNN (Zou, 2019) (paper)
- TabModel - modified from fastai’s TabularModel
- OmniScale - Omni-Scale 1D-CNN (Tang, 2020) (paper)
- TST - Time Series Transformer (Zerveas, 2020) (paper)
- TabTransformer (Huang, 2020) (paper)
- TSiT - Adapted from ViT (Dosovitskiy, 2020) (paper)
- MiniRocket (Dempster, 2021) (paper)
- XCM - An Explainable Convolutional Neural Network (Fauvel, 2021) (paper)
- gMLP - Gated Multilayer Perceptron (Liu, 2021) (paper)
- TSPerceiver - Adapted from Perceiver IO (Jaegle, 2021) (paper)
- GatedTabTransformer (Cholakov, 2022) (paper)
- TSSequencerPlus - Adapted from Sequencer (Tatsunami, 2022) (paper)
- PatchTST (Nie, 2022) (paper)
plus other custom models like: TransformerModel, LSTMAttention, GRUAttention, …
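If you prefer to work with the architectures directly as plain Pytorch modules (instead of through the learner APIs shown in the examples further down), here is a minimal sketch. It assumes the common `(c_in, c_out)` constructor pattern used by tsai architectures such as InceptionTime; the batch sizes and class counts are just illustrative values, not from the examples below:

```python
import torch
from tsai.models.InceptionTime import InceptionTime

# illustrative toy batch: 16 samples, 3 variables, 128 time steps
xb = torch.randn(16, 3, 128)

# 3 input variables, 2 output classes (assumed values for this sketch)
model = InceptionTime(c_in=3, c_out=2)
out = model(xb)
print(out.shape)  # expected: torch.Size([16, 2])
```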
To get to know the tsai package, we’d suggest you start with this notebook in Google Colab: 01_Intro_to_Time_Series_Classification. It provides an overview of a time series classification task.
We have also developed many other tutorial notebooks.
To use tsai in your own notebooks, the only thing you need to do after you have installed the package is to run this:

```python
from tsai.all import *
```
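If you want to confirm which versions of tsai, fastai and Pytorch that import picked up, you can use tsai's setup helper. This is a small sketch assuming the `my_setup` utility available in recent tsai versions:

```python
from tsai.all import *

# prints environment details (os, python, tsai, fastai, fastcore, torch, device, ...)
my_setup()
```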
These are just a few examples of how you can use `tsai`:
Training:
```python
from tsai.basics import *

X, y, splits = get_classification_data('ECG200', split_data=False)
tfms = [None, TSClassification()]
batch_tfms = TSStandardize()
clf = TSClassifier(X, y, splits=splits, path='models', arch="InceptionTimePlus",
                   tfms=tfms, batch_tfms=batch_tfms, metrics=accuracy, cbs=ShowGraph())
clf.fit_one_cycle(100, 3e-4)
clf.export("clf.pkl")
```
Inference:
```python
from tsai.inference import load_learner

clf = load_learner("models/clf.pkl")
probas, target, preds = clf.get_X_preds(X[splits[1]], y[splits[1]])
```
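As an optional follow-up (not part of the example above), you can turn the returned class probabilities into hard label indices yourself. This sketch assumes `probas` is an (n_samples, n_classes) CPU tensor/array as returned by `get_X_preds` above:

```python
import numpy as np

# convert class probabilities into predicted class indices
pred_idx = np.asarray(probas).argmax(axis=1)
print(pred_idx[:5])
```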
Training:
```python
from tsai.basics import *

X, y, splits = get_classification_data('LSST', split_data=False)
tfms = [None, TSClassification()]
batch_tfms = TSStandardize(by_sample=True)
mv_clf = TSClassifier(X, y, splits=splits, path='models', arch="InceptionTimePlus",
                      tfms=tfms, batch_tfms=batch_tfms, metrics=accuracy, cbs=ShowGraph())
mv_clf.fit_one_cycle(10, 1e-2)
mv_clf.export("mv_clf.pkl")
```
Inference:
```python
from tsai.inference import load_learner

mv_clf = load_learner("models/mv_clf.pkl")
probas, target, preds = mv_clf.get_X_preds(X[splits[1]], y[splits[1]])
```
Training:
```python
from tsai.basics import *

X, y, splits = get_regression_data('AppliancesEnergy', split_data=False)
tfms = [None, TSRegression()]
batch_tfms = TSStandardize(by_sample=True)
reg = TSRegressor(X, y, splits=splits, path='models', arch="TSTPlus",
                  tfms=tfms, batch_tfms=batch_tfms, metrics=rmse, cbs=ShowGraph(), verbose=True)
reg.fit_one_cycle(100, 3e-4)
reg.export("reg.pkl")
```
Inference:
```python
from tsai.inference import load_learner

reg = load_learner("models/reg.pkl")
raw_preds, target, preds = reg.get_X_preds(X[splits[1]], y[splits[1]])
```
The ROCKETs (RocketClassifier, RocketRegressor, MiniRocketClassifier, MiniRocketRegressor, MiniRocketVotingClassifier or MiniRocketVotingRegressor) are somewhat different models. They are not actually deep learning models (although they use convolutions) and are used in a different way. You'll also need to install sktime to be able to use them. You can install it separately:

```
pip install sktime
```

or use:

```
pip install tsai[extras]
```
Training:
```python
from sklearn.metrics import mean_squared_error, make_scorer
from tsai.data.external import get_Monash_regression_data
from tsai.models.MINIROCKET import MiniRocketRegressor

X_train, y_train, *_ = get_Monash_regression_data('AppliancesEnergy')
rmse_scorer = make_scorer(mean_squared_error, greater_is_better=False)
reg = MiniRocketRegressor(scoring=rmse_scorer)
reg.fit(X_train, y_train)
reg.save('MiniRocketRegressor')
```
Inference:
```python
from sklearn.metrics import mean_squared_error
from tsai.data.external import get_Monash_regression_data
from tsai.models.MINIROCKET import load_minirocket

*_, X_test, y_test = get_Monash_regression_data('AppliancesEnergy')
reg = load_minirocket('MiniRocketRegressor')
y_pred = reg.predict(X_test)
mean_squared_error(y_test, y_pred, squared=False)
```
You can use tsai for forecasting in the following scenarios:
- univariate or multivariate time series input
- univariate or multivariate time series output
- single or multi-step ahead
You’ll need to:

- prepare X (time series input) and the target y (see documentation and the shape sketch right after this list)
- select PatchTST or one of tsai’s models ending in Plus (TSTPlus, InceptionTimePlus, TSiTPlus, etc.). The model will auto-configure a head to yield an output with the same shape as the target input y.
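As a quick illustration of the shapes involved (the series and window sizes here are just assumptions for this sketch, not part of the examples below), `SlidingWindow` turns a raw series into fixed-length input windows X and the corresponding future values y:

```python
from tsai.basics import *
import numpy as np

# toy univariate series shaped as a single-column array
ts = np.arange(1000, dtype=float).reshape(-1, 1)

# 60-step input windows, predicting the next 3 values after each window
X, y = SlidingWindow(60, horizon=3)(ts)
print(X.shape, y.shape)  # e.g. (n_windows, 1, 60) and (n_windows, 3)
```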
Training:
```python
from tsai.basics import *

ts = get_forecasting_time_series("Sunspots").values
X, y = SlidingWindow(60, horizon=1)(ts)
splits = TimeSplitter(235)(y)
tfms = [None, TSForecasting()]
batch_tfms = TSStandardize()
fcst = TSForecaster(X, y, splits=splits, path='models', tfms=tfms, batch_tfms=batch_tfms,
                    bs=512, arch="TSTPlus", metrics=mae, cbs=ShowGraph())
fcst.fit_one_cycle(50, 1e-3)
fcst.export("fcst.pkl")
```
Inference:
```python
from tsai.inference import load_learner

fcst = load_learner("models/fcst.pkl", cpu=False)
raw_preds, target, preds = fcst.get_X_preds(X[splits[1]], y[splits[1]])
raw_preds.shape
# torch.Size([235, 1])
```
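As an optional follow-up (not part of the original example), you could compute the mean absolute error of these test-split forecasts directly with numpy, assuming `raw_preds` and `target` are CPU tensors/arrays shaped (235, 1) as above:

```python
import numpy as np

# mean absolute error of the 1-step-ahead forecasts on the test split
mae_value = np.abs(np.asarray(raw_preds).squeeze() - np.asarray(target).squeeze()).mean()
print(mae_value)
```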
This example shows how to build a 3-step ahead univariate forecast.
Training:
```python
from tsai.basics import *

ts = get_forecasting_time_series("Sunspots").values
X, y = SlidingWindow(60, horizon=3)(ts)
splits = TimeSplitter(235, fcst_horizon=3)(y)
tfms = [None, TSForecasting()]
batch_tfms = TSStandardize()
fcst = TSForecaster(X, y, splits=splits, path='models', tfms=tfms, batch_tfms=batch_tfms,
                    bs=512, arch="TSTPlus", metrics=mae, cbs=ShowGraph())
fcst.fit_one_cycle(50, 1e-3)
fcst.export("fcst.pkl")
```
Inference:
```python
from tsai.inference import load_learner

fcst = load_learner("models/fcst.pkl", cpu=False)
raw_preds, target, preds = fcst.get_X_preds(X[splits[1]], y[splits[1]])
raw_preds.shape
# torch.Size([235, 3])
```
The input format for all time series models and image models in tsai is the same: an np.ndarray (or array-like object like zarr, etc.) with 3 dimensions:

[# samples x # variables x sequence length]
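For instance, a minimal sketch (the array sizes are just assumptions for illustration) showing data in the expected layout:

```python
import numpy as np

# 100 samples, 3 variables, 50 time steps -> [# samples x # variables x sequence length]
X = np.random.randn(100, 3, 50).astype(np.float32)
y = np.random.randint(0, 2, 100)  # one label per sample
print(X.shape)  # (100, 3, 50)
```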
The input format for tabular models in tsai (like TabModel, TabTransformer and TabFusionTransformer) is a pandas dataframe. See example.
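For example, a hypothetical dataframe (column names and values are made up for this sketch) with one row per sample, mixing categorical and continuous columns plus a target:

```python
import pandas as pd

# hypothetical tabular input: one row per sample
df = pd.DataFrame({
    "sex":    ["M", "F", "M"],   # categorical feature
    "age":    [23, 41, 35],      # continuous feature
    "target": [0, 1, 0],         # label column
})
print(df.head())
```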
We welcome contributions of all kinds: development of enhancements, bug fixes, documentation, tutorial notebooks, …
We have created a guide to help you start contributing to tsai. You can read it here.
Want to make the most out of timeseriesAI/tsai in a professional setting? Let us help. Send us an email to learn more: info@timeseriesai.co
If you use tsai in your research please use the following BibTeX entry:
```
@Misc{tsai,
    author =       {Ignacio Oguiza},
    title =        {tsai - A state-of-the-art deep learning library for time series and sequential data},
    howpublished = {Github},
    year =         {2023},
    url =          {https://github.com/timeseriesAI/tsai}
}
```