Deep and online learning with spiking neural networks in Python


The brain is the perfect place to look for inspiration to develop more efficient neural networks. One of the main differences with modern deep learning is that the brain encodes information in spikes rather than continuous activations. snnTorch is a Python package for performing gradient-based learning with spiking neural networks. It extends the capabilities of PyTorch, taking advantage of its GPU-accelerated tensor computation and applying it to networks of spiking neurons. Pre-designed spiking neuron models are seamlessly integrated within the PyTorch framework and can be treated as recurrent activation units.
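
For example, a single snnTorch neuron can be stepped through time much like an RNN cell. A minimal sketch (the input values here are arbitrary; see the Quickstart below for a full network):

```python
import torch
import snntorch as snn

lif = snn.Leaky(beta=0.9)      # leaky integrate-and-fire neuron, decay rate 0.9
mem = lif.init_leaky()         # initialize the hidden state (membrane potential)

for step in range(10):         # unroll over time, as with a recurrent cell
    cur = torch.rand(1)        # arbitrary input current at this time step
    spk, mem = lif(cur, mem)   # returns the output spike and updated membrane
```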

![spiking neuron animation](https://github.com/jeshraghian/snntorch/blob/master/docs/_static/img/spike_excite_alpha_ps2.gif?raw=true)

If you like this project, please consider starring ⭐ this repo as it is the easiest and best way to support it.

If you have issues, comments, or are looking for advice on training spiking neural networks, you can open an issue, a discussion, or chat in our Discord channel.

snnTorch Structure

snnTorch contains the following components:

| Component | Description |
|---|---|
| snntorch | a spiking neuron library like torch.nn, deeply integrated with autograd |
| snntorch.export_nir | enables exporting to other SNN libraries via NIR |
| snntorch.functional | common arithmetic operations on spikes, e.g., loss, regularization, etc. |
| snntorch.import_nir | enables importing from other SNN libraries via NIR |
| snntorch.spikegen | a library for spike generation and data conversion |
| snntorch.spikeplot | visualization tools for spike-based data using matplotlib and celluloid |
| snntorch.surrogate | optional surrogate gradient functions |
| snntorch.utils | dataset utility functions |
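
For instance, snntorch.spikegen converts static data into spike trains. A minimal rate-coding sketch (the input here is a random stand-in for a normalized image):

```python
import torch
from snntorch import spikegen

raw = torch.rand(1, 28, 28)                # stand-in for a normalized image
spikes = spikegen.rate(raw, num_steps=25)  # Bernoulli rate coding over 25 steps
print(spikes.shape)                        # torch.Size([25, 1, 28, 28])
```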

snnTorch is designed to be intuitively used with PyTorch, as though each spiking neuron were simply another activation in a sequence of layers. It is therefore agnostic to fully-connected layers, convolutional layers, residual connections, etc.

At present, the neuron models are represented by recursive functions, which removes the need to store membrane potential traces for all neurons in a system in order to calculate the gradient. The lean requirements of snnTorch enable small and large networks to be viably trained on CPU, where needed. Provided that the network models and tensors are loaded onto CUDA, snnTorch takes advantage of GPU acceleration in the same way as PyTorch.
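
As a rough sketch of what such a recursive update looks like for a leaky integrate-and-fire neuron (plain Python for illustration only, assuming a unit threshold and reset-by-subtraction; each snnTorch neuron model documents its own exact update):

```python
beta = 0.9       # membrane decay rate
threshold = 1.0  # firing threshold

def lif_step(mem, x):
    """One recursive time step: decay, integrate, spike, reset."""
    mem = beta * mem + x                    # leaky integration of the input
    spk = 1.0 if mem > threshold else 0.0   # spike when the threshold is crossed
    mem = mem - spk * threshold             # reset by subtraction
    return spk, mem

# Only the current membrane value is carried forward, not the full trace.
mem, spikes = 0.0, []
for x in [0.6, 0.6, 0.6, 0.0, 0.9]:  # arbitrary input currents
    spk, mem = lif_step(mem, x)
    spikes.append(spk)
```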

Citation

If you find snnTorch useful in your work, please cite the following source:

Jason K. Eshraghian, Max Ward, Emre Neftci, Xinxin Wang, Gregor Lenz, Girish Dwivedi, Mohammed Bennamoun, Doo Seok Jeong, and Wei D. Lu. "Training Spiking Neural Networks Using Lessons From Deep Learning". Proceedings of the IEEE, 111(9), September 2023.

```
@article{eshraghian2021training,
    title   = {Training spiking neural networks using lessons from deep learning},
    author  = {Eshraghian, Jason K. and Ward, Max and Neftci, Emre and Wang, Xinxin
               and Lenz, Gregor and Dwivedi, Girish and Bennamoun, Mohammed and
               Jeong, Doo Seok and Lu, Wei D.},
    journal = {Proceedings of the IEEE},
    volume  = {111},
    number  = {9},
    pages   = {1016--1054},
    year    = {2023}
}
```

Let us know if you are using snnTorch in any interesting work, research, or blogs, as we would love to hear more about it! Reach out at snntorch@gmail.com.

Requirements

PyTorch should be installed to use snnTorch. Ensure the correct version of torch is installed for your system to enable CUDA compatibility.

The following packages are automatically installed if using the pip command:

  • numpy
  • pandas

The following packages are required for using export_nir and import_nir:

  • nir
  • nirtorch
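
As an illustration only, the NIR round trip might look like the sketch below. The function names export_to_nir and import_from_nir are assumptions based on the NIR integration and may differ across snnTorch versions; check the API docs.

```python
import torch
import torch.nn as nn
import snntorch as snn
from snntorch import export_nir, import_nir

# A tiny network to round-trip through NIR (hypothetical usage).
net = nn.Sequential(nn.Linear(784, 10), snn.Leaky(beta=0.9, init_hidden=True))
sample_data = torch.rand(1, 784)

nir_graph = export_nir.export_to_nir(net, sample_data)  # assumed export entry point
net_back = import_nir.import_from_nir(nir_graph)        # assumed import entry point
```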

The following packages are required for using spikeplot:

  • matplotlib
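
For example, snntorch.spikeplot can render a spike raster. A minimal sketch with toy data (argument names follow the tutorial usage):

```python
import torch
import matplotlib.pyplot as plt
from snntorch import spikeplot as splt

spikes = (torch.rand(25, 64) < 0.2).float()  # toy raster: 25 steps x 64 neurons
fig, ax = plt.subplots()
splt.raster(spikes, ax, s=1.5, c="black")    # scatter spike events over time
ax.set_xlabel("time step")
ax.set_ylabel("neuron index")
plt.show()
```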

Installation

Run the following to install:

$ pip install snntorch

To install snnTorch from source instead:

$ git clone https://github.com/jeshraghian/snnTorch
$ cd snntorch
$ python setup.py install

To install snntorch with conda:

$ conda install -c conda-forge snntorch

To install for an Intelligence Processing Unit (IPU) based build using Graphcore's accelerators:

$ pip install snntorch-ipu

API & Examples

A complete API is available here. Examples, tutorials, and Colab notebooks are provided.

Quickstart

Open In Colab

Here are a few ways you can get started with snnTorch:

For a quick example to run snnTorch, see the following snippet, or test the quickstart notebook:

```python
import torch, torch.nn as nn
import snntorch as snn
from snntorch import surrogate
from snntorch import utils

num_steps = 25  # number of time steps
batch_size = 1
beta = 0.5  # neuron decay rate
spike_grad = surrogate.fast_sigmoid()  # surrogate gradient

net = nn.Sequential(
    nn.Conv2d(1, 8, 5),
    nn.MaxPool2d(2),
    snn.Leaky(beta=beta, init_hidden=True, spike_grad=spike_grad),
    nn.Conv2d(8, 16, 5),
    nn.MaxPool2d(2),
    snn.Leaky(beta=beta, init_hidden=True, spike_grad=spike_grad),
    nn.Flatten(),
    nn.Linear(16 * 4 * 4, 10),
    snn.Leaky(beta=beta, init_hidden=True, spike_grad=spike_grad, output=True)
)

data_in = torch.rand(num_steps, batch_size, 1, 28, 28)  # random input data
spike_recording = []  # record spikes over time
utils.reset(net)  # reset/initialize hidden states for all neurons

for step in range(num_steps):  # loop over time
    spike, state = net(data_in[step])  # one time step of forward-pass
    spike_recording.append(spike)  # record spikes in list
```
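
Continuing from the snippet above, the recorded spikes can be turned into a training signal with snntorch.functional. A minimal sketch, assuming rate-coded cross-entropy loss and hypothetical class labels:

```python
import snntorch.functional as SF

loss_fn = SF.ce_rate_loss()                    # cross-entropy on output firing rates
targets = torch.randint(0, 10, (batch_size,))  # hypothetical class labels

spikes_out = torch.stack(spike_recording)      # shape: [num_steps, batch_size, 10]
loss = loss_fn(spikes_out, targets)            # loss over the whole spike train
loss.backward()                                # surrogate gradients flow back through time
```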

A Deep Dive into SNNs

If you wish to learn all the fundamentals of training spiking neural networks, from neuron models, to the neural code, up to backpropagation, the snnTorch tutorial series is a great place to begin. It consists of interactive notebooks with complete explanations that can get you up to speed.

| Tutorial | Title | Colab Link |
|---|---|---|
| Tutorial 1 | Spike Encoding with snnTorch | Open In Colab |
| Tutorial 2 | The Leaky Integrate and Fire Neuron | Open In Colab |
| Tutorial 3 | A Feedforward Spiking Neural Network | Open In Colab |
| Tutorial 4 | 2nd Order Spiking Neuron Models (Optional) | Open In Colab |
| Tutorial 5 | Training Spiking Neural Networks with snnTorch | Open In Colab |
| Tutorial 6 | Surrogate Gradient Descent in a Convolutional SNN | Open In Colab |
| Tutorial 7 | Neuromorphic Datasets with Tonic + snnTorch | Open In Colab |

| Advanced Tutorials | Colab Link |
|---|---|
| Population Coding | Open In Colab |
| Regression: Part I - Membrane Potential Learning with LIF Neurons | Open In Colab |
| Regression: Part II - Regression-based Classification with Recurrent LIF Neurons | Open In Colab |
| Accelerating snnTorch on IPUs | |

Contributing

If you're ready to contribute to snnTorch, instructions to do so can be found here.

Acknowledgments

snnTorch is currently maintained by the UCSC Neuromorphic Computing Group. It was initially developed by Jason K. Eshraghian in the Lu Group (University of Michigan).

Additional contributions were made by Vincent Sun, Peng Zhou, Ridger Zhu, Alexander Henkes, Steven Abreu, Xinxin Wang, Sreyes Venkatesh, gekkom, and Emre Neftci.

License & Copyright

snnTorch source code is published under the terms of the MIT License. snnTorch's documentation is licensed under a Creative Commons Attribution-Share Alike 3.0 Unported License (CC BY-SA 3.0).

