Computer Science > Machine Learning
arXiv:2401.13968 (cs)
[Submitted on 25 Jan 2024]
Title: Dynamic Long-Term Time-Series Forecasting via Meta Transformer Networks
Authors: Muhammad Anwar Ma'sum, MD Rasel Sarkar, Mahardhika Pratama, Savitha Ramasamy, Sreenatha Anavatti, Lin Liu, Habibullah, Ryszard Kowalczyk
Abstract: A reliable long-term time-series forecaster is in high demand in practice but faces many challenges, such as achieving low computational and memory footprints while remaining robust in dynamic learning environments. This paper proposes Meta-Transformer Networks (MANTRA) to address dynamic long-term time-series forecasting tasks. MANTRA relies on the concept of fast and slow learners, where a collection of fast learners captures different aspects of the data distribution while adapting quickly to changes, and a slow learner tailors suitable representations for the fast learners. Fast adaptation to dynamic environments is achieved using universal representation transformer layers, which produce task-adapted representations with a small number of parameters. Our experiments on four datasets with different prediction lengths demonstrate the advantage of our approach, with at least $3\%$ improvement over the baseline algorithms in both multivariate and univariate settings. Source code for MANTRA is publicly available at \url{this https URL}.
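To illustrate the fast/slow-learner idea described in the abstract, the following is a minimal sketch, not the authors' MANTRA implementation (see the linked repository for that). All module names, layer sizes, and the simple averaging of fast-learner outputs are assumptions made for illustration: a shared transformer encoder plays the role of the slow learner, and several lightweight heads act as fast learners whose forecasts are combined.

```python
# Hypothetical sketch of a fast/slow-learner forecaster; not the paper's actual code.
import torch
import torch.nn as nn


class SlowEncoder(nn.Module):
    """Slow learner (assumed form): builds a shared representation of the input series."""
    def __init__(self, n_features: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.proj = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x):                    # x: (batch, seq_len, n_features)
        return self.encoder(self.proj(x))    # (batch, seq_len, d_model)


class FastLearner(nn.Module):
    """Fast learner (assumed form): a small head that can adapt quickly to shifts."""
    def __init__(self, d_model: int, horizon: int, n_features: int):
        super().__init__()
        self.head = nn.Linear(d_model, horizon * n_features)
        self.horizon, self.n_features = horizon, n_features

    def forward(self, h):                    # h: (batch, seq_len, d_model)
        out = self.head(h[:, -1])            # use the last time step's representation
        return out.view(-1, self.horizon, self.n_features)


class FastSlowForecaster(nn.Module):
    """Ensemble of fast learners on one slow learner; forecasts are averaged here."""
    def __init__(self, n_features: int, horizon: int, n_fast: int = 3, d_model: int = 64):
        super().__init__()
        self.slow = SlowEncoder(n_features, d_model)
        self.fast = nn.ModuleList(
            FastLearner(d_model, horizon, n_features) for _ in range(n_fast)
        )

    def forward(self, x):
        h = self.slow(x)
        preds = torch.stack([f(h) for f in self.fast], dim=0)
        return preds.mean(dim=0)             # (batch, horizon, n_features)


if __name__ == "__main__":
    model = FastSlowForecaster(n_features=7, horizon=96)
    x = torch.randn(8, 336, 7)               # e.g. 336 past steps, 7 variables
    print(model(x).shape)                    # torch.Size([8, 96, 7])
```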
Comments: Under Consideration in IEEE Transactions on Artificial Intelligence
Subjects: Machine Learning (cs.LG); Artificial Intelligence (cs.AI)
Cite as: arXiv:2401.13968 [cs.LG] (or arXiv:2401.13968v1 [cs.LG] for this version)
DOI: https://doi.org/10.48550/arXiv.2401.13968 (arXiv-issued DOI via DataCite)
Submission history
From: Mahardhika Pratama
[v1] Thu, 25 Jan 2024 06:03:56 UTC (6,063 KB)