Elastic exponential linear units for convolutional neural networks

Daeho Kim, Jinah Kim, Jaeil Kim

Research output: Contribution to journal › Article › peer-review

40 Scopus citations

Abstract

Activation functions play an important role in determining the depth and non-linearity of deep learning models. Since the Rectified Linear Unit (ReLU) was introduced, many modifications, in which noise is intentionally injected, have been proposed to avoid overfitting. The Exponential Linear Unit (ELU) and its variants, some with trainable parameters, have been proposed to reduce the bias shift effect often observed with ReLU-type activation functions. In this paper, we propose a novel activation function, called the Elastic Exponential Linear Unit (EELU), which combines the advantages of both types of activation functions in a generalized form. EELU has an elastic slope in the positive part and preserves the negative signal by using a small non-zero gradient. We also present a new strategy for injecting neuronal noise, drawn from a Gaussian distribution, into the activation function to improve generalization. We demonstrated how EELU can represent a wider variety of features under random noise than other activation functions by visualizing the latent features of convolutional neural networks. We evaluated the effectiveness of EELU through extensive image classification experiments on the CIFAR-10/CIFAR-100, ImageNet, and Tiny ImageNet datasets. Our experimental results show that EELU achieved better generalization performance and higher classification accuracy than conventional activation functions such as ReLU, ELU, ReLU- and ELU-like variants, Scaled ELU, and Swish. EELU also improved image classification performance when trained on a smaller number of samples, owing to its noise injection strategy, which allows significant variation in function outputs, including deactivation.
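To make the description above concrete, the following is a minimal PyTorch sketch of an EELU-style activation under the assumptions stated in the abstract: the positive branch is scaled by an elastic slope drawn from a Gaussian around 1 during training, and the negative branch follows an ELU-like exponential with a small non-zero gradient. The parameter names alpha, beta, and sigma, the clamping of the slope, and the deterministic slope at test time are illustrative assumptions, not the paper's exact parameterization.

```python
import torch

def eelu(x, alpha=1.0, beta=1.0, sigma=0.1, training=True):
    """Sketch of an Elastic Exponential Linear Unit (EELU).

    Positive inputs are scaled by an elastic slope k sampled per
    activation from a Gaussian around 1 (noise injection); negative
    inputs follow an ELU-like curve, preserving a small non-zero
    gradient. Hyperparameters here are illustrative only.
    """
    if training:
        # Elastic slope: random per-activation factor centered at 1.
        k = 1.0 + sigma * torch.randn_like(x)
        k = k.clamp(min=0.0)   # keep the slope non-negative; allows deactivation
    else:
        k = 1.0                # deterministic slope at inference time
    pos = k * x
    neg = alpha * (torch.exp(beta * x) - 1.0)
    return torch.where(x >= 0, pos, neg)

# Example usage on a random feature map
y = eelu(torch.randn(4, 8), training=True)
```

Sampling the slope per activation is what produces the variation in function outputs (including deactivation when the slope collapses to zero) that the abstract credits for the improved generalization on small training sets.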

Original language: English
Pages (from-to): 253-266
Number of pages: 14
Journal: Neurocomputing
Volume: 406
DOIs
State: Published - 17 Sep 2020

Keywords

  • Activation function
  • Convolutional neural network
  • Elastic Exponential Linear Unit (EELU)
  • ELU
  • Gaussian noise
  • ReLU

