A collection of 25+ PyTorch-compatible implementations of recurrent layers
TorchRecurrent is a PyTorch-compatible collection of recurrent neural network cells and layers from across the research literature. It aims to provide a unified, flexible interface that feels like native PyTorch while exposing more customization options.
Install from PyPI:

```bash
pip install torchrecurrent
```

or from conda-forge:

```bash
conda install -c conda-forge torchrecurrent
```
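Once installed with either command, a quick import check confirms the package is visible to Python; this short sketch uses only the standard library plus the package name from the commands above.

```python
# sanity check: the import succeeds only if the installation worked
import importlib.metadata

import torchrecurrent  # noqa: F401

print(importlib.metadata.version("torchrecurrent"))
```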
- 🔄 30+ recurrent cells (e.g. `LSTMCell`, `GRUCell`, and many specialized variants).
- 🏗️ 30+ recurrent layers (e.g. `LSTM`, `GRU`, and counterparts for each cell).
- 🧩 Unified API: all cells and layers follow the PyTorch interface but add extra options for initialization and customization (a cell-level sketch follows this list).
- 📚 Comprehensive documentation, including an API reference and a catalog of published models.
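As a hedged illustration of the cell-level side of this API, the sketch below unrolls a single `LSTMCell` manually over time. It assumes the cell mirrors `torch.nn.LSTMCell`'s call convention (an input tensor plus a `(h, c)` tuple per step), which is implied by the unified-API claim but not verified here; the extra initialization options are not shown.

```python
import torch
from torchrecurrent import LSTMCell  # named in the feature list above

# assumption: constructor and call signature mirror torch.nn.LSTMCell
cell = LSTMCell(input_size=10, hidden_size=20)

x = torch.randn(5, 3, 10)   # (time_steps, batch, input_size)
h = torch.zeros(3, 20)      # hidden state
c = torch.zeros(3, 20)      # cell state

# manual unroll: one call per time step
for t in range(x.size(0)):
    h, c = cell(x[t], (h, c))

print(h.shape)  # (batch, hidden_size)
```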
👉 Full model catalog: torchrecurrent Models
```python
import torch
from torchrecurrent import MGU  # minimal gated unit

# sequence: (time_steps, batch, input_size)
inp = torch.randn(5, 3, 10)

# initialize an MGU with hidden_size=20
rnn = MGU(input_size=10, hidden_size=20, num_layers=3)

# forward pass
out, hidden = rnn(inp)
print(out.shape)  # (time_steps, batch, hidden_size)
```
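Since the layer's output has the same `(time_steps, batch, hidden_size)` layout as `torch.nn.GRU`, it composes with ordinary PyTorch modules. The toy classifier below is a sketch built only on the `MGU` call shown above; the linear head, class count, and single-layer setting are illustrative choices, not part of the library.

```python
import torch
import torch.nn as nn
from torchrecurrent import MGU

class SequenceClassifier(nn.Module):
    """Toy model: MGU encoder followed by a linear head (illustrative only)."""

    def __init__(self, input_size=10, hidden_size=20, num_classes=4):
        super().__init__()
        self.rnn = MGU(input_size=input_size, hidden_size=hidden_size, num_layers=1)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        out, _ = self.rnn(x)       # out: (time_steps, batch, hidden_size)
        return self.head(out[-1])  # classify from the last time step

model = SequenceClassifier()
logits = model(torch.randn(5, 3, 10))  # (time_steps, batch, input_size)
print(logits.shape)                    # (batch, num_classes)
```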
If you use TorchRecurrent in your work, please consider citing:
```bibtex
@misc{martinuzzi2025unified,
  doi       = {10.48550/ARXIV.2510.21252},
  url       = {https://arxiv.org/abs/2510.21252},
  author    = {Martinuzzi, Francesco},
  keywords  = {Machine Learning (cs.LG), Software Engineering (cs.SE), FOS: Computer and information sciences},
  title     = {Unified Implementations of Recurrent Neural Networks in Multiple Deep Learning Frameworks},
  publisher = {arXiv},
  year      = {2025},
  copyright = {Creative Commons Attribution 4.0 International}
}
```
Related projects:

- LuxRecurrentLayers.jl: provides recurrent layers for Lux.jl in Julia.
- RecurrentLayers.jl: provides recurrent layers for Flux.jl in Julia.
- ReservoirComputing.jl: reservoir computing utilities for scientific machine learning; essentially gradient-free trained recurrent neural networks.
This project’s own code is distributed under the MIT License (see LICENSE). The primary intent of this software is academic research.
Some cells are re-implementations of published methods that carry their own licenses:
- NASCell: originally released under the Apache 2.0 license; see LICENSE-Apache2.0.txt.
Please consult each of those licenses for your obligations when using this code in commercial or closed-source settings.
⚠️ Disclaimer: TorchRecurrent is an independent project and is not affiliated with the PyTorch project or Meta AI. The name reflects compatibility with PyTorch, not any official endorsement.