TY - JOUR
T1 - Improved weight initialization for deep and narrow feedforward neural network
AU - Lee, Hyunwoo
AU - Kim, Yunho
AU - Yang, Seung Yeop
AU - Choi, Hayoung
N1 - Publisher Copyright:
© 2024 The Authors
PY - 2024/8
Y1 - 2024/8
N2 - Appropriate weight initialization, together with the ReLU activation function, has become a cornerstone of modern deep learning, enabling the training and deployment of highly effective and efficient neural network models across diverse areas of artificial intelligence. The “dying ReLU” problem, where ReLU neurons become inactive and yield zero output, presents a significant challenge in the training of deep neural networks with the ReLU activation function. Theoretical analyses and various methods have been introduced to address the problem; however, even with these advances, training remains challenging for extremely deep and narrow feedforward networks with the ReLU activation function. In this paper, we propose a novel weight initialization method to address this issue. We establish several properties of our initial weight matrix and show how these properties enable the effective propagation of signal vectors. Through a series of experiments and comparisons with existing methods, we demonstrate the effectiveness of the proposed initialization method.
AB - Appropriate weight initialization, together with the ReLU activation function, has become a cornerstone of modern deep learning, enabling the training and deployment of highly effective and efficient neural network models across diverse areas of artificial intelligence. The “dying ReLU” problem, where ReLU neurons become inactive and yield zero output, presents a significant challenge in the training of deep neural networks with the ReLU activation function. Theoretical analyses and various methods have been introduced to address the problem; however, even with these advances, training remains challenging for extremely deep and narrow feedforward networks with the ReLU activation function. In this paper, we propose a novel weight initialization method to address this issue. We establish several properties of our initial weight matrix and show how these properties enable the effective propagation of signal vectors. Through a series of experiments and comparisons with existing methods, we demonstrate the effectiveness of the proposed initialization method.
KW - Deep learning
KW - Feedforward neural networks
KW - Initial weight matrix
KW - ReLU activation function
KW - Weight initialization
UR - http://www.scopus.com/inward/record.url?scp=85192499052&partnerID=8YFLogxK
U2 - 10.1016/j.neunet.2024.106362
DO - 10.1016/j.neunet.2024.106362
M3 - Article
C2 - 38733795
AN - SCOPUS:85192499052
SN - 0893-6080
VL - 176
JO - Neural Networks
JF - Neural Networks
M1 - 106362
ER -