Abstract
Neural networks have achieved strong performance across a wide range of tasks, but further study is needed to improve their performance. We construct a specific neural network architecture with local connections that is a universal approximator, and we analyze its approximation error. This locally connected network has broader applicability than a fully connected one, since local connectivity can be used to describe diverse architectures such as CNNs. Our error estimate depends on two parameters: one controlling the depth of the hidden layers and the other controlling their width.
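To illustrate the kind of architecture the abstract refers to, the following is a minimal sketch (not the authors' construction) of a locally connected ReLU layer, in which each hidden unit is connected only to a fixed window of the input rather than to the full input vector. The window size, width, and other names below are hypothetical illustration choices, not the parameters analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def locally_connected_layer(x, weights, starts, biases):
    """Locally connected ReLU layer: hidden unit j sees only the window
    x[starts[j] : starts[j] + window], with its own (unshared) weights.
    A convolutional layer is the special case where all windows share
    the same weight vector."""
    window = weights.shape[1]
    out = np.empty(weights.shape[0])
    for j in range(weights.shape[0]):
        patch = x[starts[j]: starts[j] + window]
        out[j] = weights[j] @ patch + biases[j]
    return relu(out)

# Hypothetical sizes, for illustration only.
input_dim, window, width = 16, 4, 8          # width = number of hidden units
x = rng.normal(size=input_dim)

starts = np.linspace(0, input_dim - window, width).astype(int)
W = rng.normal(size=(width, window))          # unshared local weights
b = rng.normal(size=width)

hidden = locally_connected_layer(x, W, starts, b)

# Stacking such layers (depth) and enlarging them (width) correspond to the
# two parameters the abstract's error estimate is stated in terms of.
print(hidden.shape)   # (8,)
```

Because a convolutional layer arises from this connectivity pattern by sharing one weight vector across all windows, results for locally connected networks can be read as statements about CNN-style architectures as well, which is the sense in which the abstract calls the local construction more broadly applicable.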
| Original language | English |
| --- | --- |
| Pages (from-to) | 275-288 |
| Number of pages | 14 |
| Journal | Japan Journal of Industrial and Applied Mathematics |
| Volume | 40 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 2023 |
Keywords
- Approximation theory
- Cybenko
- Deep neural nets
- ReLU network