Tutorial on brain-inspired computing part 2: Multilayer perceptron and natural gradient learning

Research output: Contribution to specialist publication › Article

7 Scopus citations

Abstract

Since the perceptron was developed for learning to classify input patterns, there have been many studies on simple and multilayer perceptrons. Despite wide and active study in both theory and applications, multilayer perceptrons still face unsettled problems such as slow learning speed and overfitting. A thorough solution to these problems requires consolidating previous studies and finding new directions that raise the practical power of multilayer perceptrons. As a first step toward this new stage of study, we give short reviews of two interesting and important approaches: the stochastic approach and the geometric approach. We also explain an efficient learning algorithm developed from these statistical and geometrical studies, now well known as the natural gradient learning method.
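The natural gradient method mentioned above preconditions the ordinary gradient by the inverse Fisher information matrix, so that parameter updates follow the steepest descent direction on the statistical manifold rather than in raw parameter space. The following is a minimal numpy sketch of that core update, w ← w − η F⁻¹∇L, on a toy linear-Gaussian model; the data, model, and learning rate here are illustrative assumptions, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian model: y = X @ w_true + noise.
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=n)

def grad(w):
    # Gradient of the mean squared error 0.5 * mean((X @ w - y) ** 2).
    return X.T @ (X @ w - y) / n

# Fisher information matrix of this model (up to the noise variance,
# which only rescales the effective step size): F = X^T X / n.
F = X.T @ X / n

w = np.zeros(d)
lr = 1.0
for _ in range(20):
    # Natural gradient step: precondition the plain gradient by F^{-1}.
    # (Solve F v = grad rather than forming the explicit inverse.)
    w = w - lr * np.linalg.solve(F, grad(w))

print(w)
```

For this quadratic loss the Fisher matrix coincides with the Hessian, so the natural gradient step with unit learning rate reaches the least-squares solution immediately; on multilayer perceptrons the same preconditioning is what accelerates learning near the singular regions of the neuromanifold.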

Original language: English
Pages: 79-95
Number of pages: 17
Volume: 24
No.: 1
Specialist publication: New Generation Computing
DOIs
State: Published - 2006

Keywords

  • Backpropagation learning
  • Gradient descent learning
  • Multilayer perceptrons
  • Natural gradient
  • Neuromanifold
  • Singularity
  • Stochastic neural networks
