TY - GEN
T1 - Do Not Forget
T2 - 40th IEEE International Conference on Computer Design, ICCD 2022
AU - Kwak, Myeongjin
AU - Kim, Yongtae
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - This paper presents a novel early training termination technique that significantly improves the training speed and energy efficiency of unsupervised learning-based spiking neural networks (SNNs) by skipping redundant training samples. To achieve early termination, we leverage the key observation that unsupervised SNNs tend to stably retain previously learned information, and we systematically analyze the spike firing activity of the network during training. To make a training termination decision, we exploit the difference between the numbers of spikes generated by the previous and current input training samples. Our termination algorithm is applied to an SNN trained with the spike-timing-dependent plasticity (STDP) learning rule for a pattern classification application. The proposed scheme makes early termination decisions with negligible accuracy loss by adequately ignoring redundant training samples. Specifically, it improves training speed and energy efficiency by up to 5.07× and 5.14×, respectively, with less than 1 percentage point (pp) of accuracy loss compared to the baseline counterparts, by skipping up to 80% of the training samples. Additionally, when employed in a VLSI-based neuromorphic chip environment, it achieves up to 4.95× better energy efficiency than the baseline.
KW - adaptive membrane threshold
KW - early training termination
KW - leaky integrate-and-fire (LIF) neuron
KW - spike-timing-dependent plasticity (STDP)
KW - spiking neural network (SNN)
UR - http://www.scopus.com/inward/record.url?scp=85145884574&partnerID=8YFLogxK
U2 - 10.1109/ICCD56317.2022.00069
DO - 10.1109/ICCD56317.2022.00069
M3 - Conference contribution
AN - SCOPUS:85145884574
T3 - Proceedings - IEEE International Conference on Computer Design: VLSI in Computers and Processors
SP - 419
EP - 426
BT - Proceedings - 2022 IEEE 40th International Conference on Computer Design, ICCD 2022
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 23 October 2022 through 26 October 2022
ER -