Error bounds for ReLU networks with depth and width parameters

Research output: Contribution to journal › Article › peer-review

Abstract

Neural networks have shown highly successful performance in a wide range of tasks, but further study is needed to improve their performance. We construct a specific neural network architecture with local connections that is a universal approximator, and analyze its approximation error. This locally connected network has broader applicability than a fully connected one, because locally connected networks can describe diverse architectures such as CNNs. Our error estimate depends on two parameters: one controlling the depth of the network and the other the width of the hidden layers.
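In a locally connected layer, each hidden unit sees only a window of the previous layer's outputs, and, unlike a convolution, the windows need not share weights, which is why this family subsumes CNNs as a special case. The Python sketch below is a hypothetical illustration of one such ReLU layer, not the paper's construction; the function names and the window/stride parameters are assumptions made for the example.

    import numpy as np

    def relu(x):
        # Rectified linear unit applied elementwise.
        return np.maximum(x, 0.0)

    def locally_connected_layer(x, weights, biases, window, stride):
        """One locally connected ReLU layer.

        Each output unit j has its own (unshared) weight window over
        the slice x[j*stride : j*stride + window]; tying the rows of
        `weights` together would recover an ordinary convolution.
        """
        n_out = len(biases)
        out = np.empty(n_out)
        for j in range(n_out):
            start = j * stride
            patch = x[start:start + window]
            out[j] = relu(weights[j] @ patch + biases[j])
        return out

    # Toy usage: the window/stride choice controls the layer width,
    # and stacking such layers controls the depth.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(16)
    window, stride = 4, 2
    n_out = (len(x) - window) // stride + 1  # 7 hidden units
    W = rng.standard_normal((n_out, window))
    b = rng.standard_normal(n_out)
    h = locally_connected_layer(x, W, b, window, stride)
    print(h.shape)  # (7,)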

Original language: English
Pages (from-to): 275-288
Number of pages: 14
Journal: Japan Journal of Industrial and Applied Mathematics
Volume: 40
Issue number: 1
DOIs
State: Published - Jan 2023

Keywords

  • Approximation theory
  • Cybenko
  • Deep neural nets
  • ReLU network

