TDSNN: From Deep Neural Networks to Deep Spike Neural Networks with Temporal-Coding
Authors
- Lei Zhang, Institute of Computing Technology, Chinese Academy of Sciences
- Shengyuan Zhou, University of Chinese Academy of Sciences
- Tian Zhi, Institute of Computing Technology, Chinese Academy of Sciences
- Zidong Du, Institute of Computing Technology, Chinese Academy of Sciences
- Yunji Chen, Institute of Computing Technology, Chinese Academy of Sciences
DOI:
https://doi.org/10.1609/aaai.v33i01.33011319

Abstract
Continuous-valued deep convolutional networks (DNNs) can be converted into accurate rate-coding based spike neural networks (SNNs). However, the substantial computational and energy costs caused by multiple spikes limit their use in mobile and embedded applications. Recent works have shown that the newly emerged temporal-coding based SNNs converted from DNNs can reduce the computational load effectively. In this paper, we propose a novel method, called TDSNN, to convert DNNs to temporal-coding SNNs. Combined with the characteristics of the leaky integrate-and-fire (LIF) neuron model, we put forward a new coding principle, Reverse Coding, and design a novel Ticking Neuron mechanism. According to our evaluation, our proposed method achieves a 42% reduction in total operations on average in large networks compared with DNNs, with no more than 0.5% accuracy loss. The evaluation shows that TDSNN may prove to be one of the key enablers to make the adoption of SNNs widespread.
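To illustrate the temporal-coding idea the abstract builds on, the sketch below simulates a generic LIF neuron and reads out its time-to-first-spike: stronger input drives the membrane potential to threshold sooner, so information is carried in *when* a single spike occurs rather than in a spike rate. This is a minimal illustration only; the function name, parameters, and Euler discretization are assumptions for exposition, not the paper's TDSNN conversion, Reverse Coding, or Ticking Neuron mechanism.

```python
def lif_first_spike_time(input_current, tau=20.0, v_thresh=1.0, dt=1.0, t_max=100.0):
    """Simulate a leaky integrate-and-fire neuron under a constant input
    current and return the time of its first spike (time-to-first-spike code).
    Returns None if threshold is never reached within t_max.
    NOTE: illustrative toy model, not the TDSNN neuron model itself."""
    v = 0.0  # membrane potential, starting at rest
    t = 0.0
    while t < t_max:
        # Forward-Euler step of the LIF membrane equation: dv/dt = (-v + I) / tau
        v += dt * (-input_current is None or (-v + input_current)) if False else dt * (-v + input_current) / tau
        t += dt
        if v >= v_thresh:
            return t  # first spike fired; larger input -> earlier spike
    return None  # sub-threshold input: no spike within the simulation window

# Temporal coding in one line: a stronger activation spikes earlier.
t_strong = lif_first_spike_time(3.0)
t_weak = lif_first_spike_time(1.5)
print(t_strong, t_weak)
```

Running this shows `t_strong < t_weak`, i.e. the spike latency is a monotonically decreasing function of input strength, which is the property that lets a single early/late spike stand in for a continuous DNN activation.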