Precision Exploration of Floating-Point Arithmetic for Spiking Neural Networks

Myeongjin Kwak, Hyoju Seo, Yongtae Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

In this paper, we explore the precision of various floating-point representations for energy-efficient spiking neural networks (SNNs). The IEEE 754-based 32-bit single-precision format and several reduced-precision floating-point formats are applied to the leaky integrate-and-fire (LIF) neurons of the SNN to investigate the impact of reduced precision on classification accuracy. When adopted in an unsupervised two-layer SNN for MNIST digit recognition, 16-bit floating-point formats can be used for both training and inference without any degradation in classification performance. Furthermore, our experimental results reveal that a floating-point format with a 4-bit exponent and a 6-bit mantissa is sufficient for SNN training and inference, and offers substantial area, power, and energy reductions compared to the other formats.
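
As a rough illustration of the kind of precision sweep described in the abstract, the sketch below (not taken from the paper) emulates a reduced-precision floating-point format in software and applies it to a single LIF membrane update. The quantize and lif_step names, the threshold v_th, the leak factor, and the reset-to-zero behavior are illustrative assumptions, and subnormal and rounding-mode handling is deliberately simplified.

```python
import numpy as np

def quantize(x, exp_bits=4, man_bits=6):
    """Round float64 values to a custom (1, exp_bits, man_bits)
    floating-point format: round-to-nearest on the significand,
    flush-to-zero below the normal range, saturate on overflow."""
    x = np.asarray(x, dtype=np.float64)
    bias = 2 ** (exp_bits - 1) - 1
    emin, emax = 1 - bias, bias          # normal exponent range
    mant, exp = np.frexp(x)              # x = mant * 2**exp, |mant| in [0.5, 1)
    mant, exp = mant * 2.0, exp - 1      # rescale so |mant| is in [1, 2)
    step = 2.0 ** man_bits
    mant = np.round(mant * step) / step  # keep man_bits fractional bits
    out = np.ldexp(mant, exp)
    out = np.where(exp < emin, 0.0, out)          # flush subnormals to zero
    max_val = (2.0 - 1.0 / step) * 2.0 ** emax
    return np.clip(out, -max_val, max_val)        # saturate on overflow

def lif_step(v, i_in, v_th=1.0, leak=0.9):
    """One LIF membrane update with every intermediate result
    re-quantized, mimicking a reduced-precision datapath.
    v_th, leak, and reset-to-zero are illustrative choices."""
    v = quantize(quantize(leak * v) + quantize(i_in))  # leak, then integrate
    spike = v >= v_th
    return np.where(spike, 0.0, v), spike              # reset on spike
```

With exp_bits=5 and man_bits=10, the same routine approximates IEEE 754 half precision (modulo subnormals and tie-breaking), so one quantizer can cover a sweep from 32-bit down to the 4-bit-exponent, 6-bit-mantissa format the abstract reports.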

Original language: English
Title of host publication: Proceedings - International SoC Design Conference 2021, ISOCC 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 71-72
Number of pages: 2
ISBN (Electronic): 9781665401746
DOIs
State: Published - 2021
Event: 18th International System-on-Chip Design Conference, ISOCC 2021 - Jeju Island, Korea, Republic of
Duration: 6 Oct 2021 - 9 Oct 2021

Publication series

Name: Proceedings - International SoC Design Conference 2021, ISOCC 2021

Conference

Conference: 18th International System-on-Chip Design Conference, ISOCC 2021
Country/Territory: Korea, Republic of
City: Jeju Island
Period: 6/10/21 - 9/10/21

Keywords

  • floating-point adder
  • floating-point representation
  • neuromorphic computing
  • precision
  • spiking neural network (SNN)
