
TorchRecurrent


TorchRecurrent is a PyTorch-compatible collection of recurrent neural network cells and layers from across the research literature. It aims to provide a unified, flexible interface that feels like native PyTorch while exposing more customization options.

Installation

pip install torchrecurrent

or via conda-forge

conda install -c conda-forge torchrecurrent

Features

  • 🔄 30+ recurrent cells (e.g. LSTMCell, GRUCell, and many specialized variants).
  • 🏗️ 30+ recurrent layers (e.g. LSTM, GRU, and counterparts for each cell).
  • 🧩 Unified API — all cells/layers follow the PyTorch interface but add extra options for initialization and customization.
  • 📚 Comprehensive documentation including API reference and a catalog of published models.

👉 Full model catalog: torchrecurrent Models
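Since the cells follow the standard PyTorch cell interface, they can be stepped through time manually, just like the stock cells. The sketch below illustrates that calling convention using torch.nn.LSTMCell as a stand-in; substituting one of torchrecurrent's cells is an assumption based on the unified-API claim above, not a tested snippet.

```python
import torch
import torch.nn as nn

time_steps, batch, input_size, hidden_size = 5, 3, 10, 20

# Stand-in for any PyTorch-style recurrent cell; torchrecurrent
# cells are assumed to accept the same (input, state) signature.
cell = nn.LSTMCell(input_size, hidden_size)

inp = torch.randn(time_steps, batch, input_size)
h = torch.zeros(batch, hidden_size)  # initial hidden state
c = torch.zeros(batch, hidden_size)  # initial cell state

outputs = []
for t in range(time_steps):
    # one step of the recurrence over the t-th time slice
    h, c = cell(inp[t], (h, c))
    outputs.append(h)

# stack per-step hidden states: (time_steps, batch, hidden_size)
out = torch.stack(outputs)
print(out.shape)
```

This manual loop is what a layer such as MGU or LSTM runs internally; the layer classes simply add multi-layer stacking and state management on top.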

Quick Example

import torch
from torchrecurrent import MGU  # minimal gated unit

# sequence: (time_steps, batch, input_size)
inp = torch.randn(5, 3, 10)

# initialize a MGU with hidden_size=20
rnn = MGU(input_size=10, hidden_size=20, num_layers=3)

# forward pass
out, hidden = rnn(inp)
print(out.shape)  # (time_steps, batch, hidden_size)

Citation

If you use TorchRecurrent in your work, please consider citing

@misc{martinuzzi2025unified,
  doi       = {10.48550/ARXIV.2510.21252},
  url       = {https://arxiv.org/abs/2510.21252},
  author    = {Martinuzzi, Francesco},
  keywords  = {Machine Learning (cs.LG), Software Engineering (cs.SE), FOS: Computer and information sciences},
  title     = {Unified Implementations of Recurrent Neural Networks in Multiple Deep Learning Frameworks},
  publisher = {arXiv},
  year      = {2025},
  copyright = {Creative Commons Attribution 4.0 International}
}

See also

LuxRecurrentLayers.jl: Provides recurrent layers for Lux.jl in Julia.

RecurrentLayers.jl: Provides recurrent layers for Flux.jl in Julia.

ReservoirComputing.jl: Reservoir computing utilities for scientific machine learning; essentially recurrent neural networks trained without gradients.

License

This project’s own code is distributed under the MIT License (see LICENSE). The primary intent of this software is academic research.

Third-party Attributions

Some cells are re-implementations of published methods that carry their own licenses. Please consult each of those licenses for your obligations when using this code in commercial or closed-source settings.

⚠️ Disclaimer: TorchRecurrent is an independent project and is not affiliated with the PyTorch project or Meta AI. The name reflects compatibility with PyTorch, not any official endorsement.

