TY - JOUR
T1 - ReLU network with bounded width is a universal approximator in view of an approximate identity
AU - Moon, Sunghwan
N1 - Publisher Copyright:
© 2021 by the author. Licensee MDPI, Basel, Switzerland.
PY - 2021/1/1
Y1 - 2021/1/1
N2 - Deep neural networks have shown very successful performance in a wide range of tasks, but the theory of why they work so well is still in its early stages. Recently, the expressive power of neural networks, which is important for understanding deep learning, has received considerable attention. Classic results by Cybenko, Barron, and others state that a network with a single hidden layer and a suitable activation function is a universal approximator. More recently, researchers began to study how width affects the expressiveness of neural networks, i.e., a universal approximation theorem for a deep neural network with a Rectified Linear Unit (ReLU) activation function and bounded width. Here, we show that any continuous function on a compact subset of R^(n_in), n_in ∈ N, can be approximated by a ReLU network whose hidden layers have at most n_in + 5 nodes, in view of an approximate identity.
AB - Deep neural networks have shown very successful performance in a wide range of tasks, but the theory of why they work so well is still in its early stages. Recently, the expressive power of neural networks, which is important for understanding deep learning, has received considerable attention. Classic results by Cybenko, Barron, and others state that a network with a single hidden layer and a suitable activation function is a universal approximator. More recently, researchers began to study how width affects the expressiveness of neural networks, i.e., a universal approximation theorem for a deep neural network with a Rectified Linear Unit (ReLU) activation function and bounded width. Here, we show that any continuous function on a compact subset of R^(n_in), n_in ∈ N, can be approximated by a ReLU network whose hidden layers have at most n_in + 5 nodes, in view of an approximate identity.
KW - Feed-forward neural network
KW - Deep neural nets
KW - ReLU network
KW - Universal approximation theory
UR - http://www.scopus.com/inward/record.url?scp=85099852336&partnerID=8YFLogxK
U2 - 10.3390/app11010427
DO - 10.3390/app11010427
M3 - Article
AN - SCOPUS:85099852336
SN - 2076-3417
VL - 11
SP - 1
EP - 11
JO - Applied Sciences (Switzerland)
JF - Applied Sciences (Switzerland)
IS - 1
M1 - 427
ER -