Abstract
Since the perceptron was first developed for learning to classify input patterns, there have been numerous studies on simple and multilayer perceptrons. Despite extensive research in both theory and applications, multilayer perceptrons still suffer from unresolved problems such as slow learning and overfitting. To address these problems thoroughly, it is necessary to consolidate previous studies and to find new directions for enhancing the practical power of multilayer perceptrons. As a first step toward this new stage of research on multilayer perceptrons, we give short reviews of two interesting and important approaches: the stochastic approach and the geometric approach. We also explain an efficient learning algorithm developed from these statistical and geometrical studies, which is now well known as the natural gradient learning method.
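The natural gradient method mentioned above preconditions the ordinary gradient by the inverse Fisher information matrix, so that the update respects the geometry of the parameter space rather than its raw coordinates. As a minimal illustration (not the paper's own experiments), the sketch below applies natural gradient learning to logistic regression, where the Fisher information matrix has the closed form F = (1/n) Σᵢ pᵢ(1−pᵢ) xᵢxᵢᵀ; all data, names, and hyperparameters (`lr`, `damping`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 1-D inputs with a bias column, labels drawn from a known
# logistic model (illustrative, not from the original paper).
X = np.column_stack([rng.normal(size=200), np.ones(200)])
true_w = np.array([2.0, -1.0])
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-X @ true_w))).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def natural_gradient_step(w, X, y, lr=1.0, damping=1e-4):
    """One natural-gradient update for logistic regression.

    The ordinary gradient of the negative log-likelihood is
    preconditioned by the inverse Fisher information matrix
    F = (1/n) * sum_i p_i (1 - p_i) x_i x_i^T.
    """
    p = sigmoid(X @ w)
    grad = X.T @ (p - y) / len(y)            # ordinary gradient of the NLL
    weights = p * (1.0 - p)
    F = (X * weights[:, None]).T @ X / len(y)  # Fisher information matrix
    F += damping * np.eye(len(w))            # small damping keeps F invertible
    return w - lr * np.linalg.solve(F, grad)  # natural-gradient update

w = np.zeros(2)
for _ in range(20):
    w = natural_gradient_step(w, X, y)
```

For this model the Fisher matrix coincides with the Hessian of the negative log-likelihood, so the natural gradient converges in far fewer steps than plain gradient descent with the same data; this is one concrete instance of the speedup the abstract alludes to.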
| Original language | English |
| --- | --- |
| Pages | 79-95 |
| Number of pages | 17 |
| Volume | 24 |
| No | 1 |
| Specialist publication | New Generation Computing |
| DOIs | |
| State | Published - 2006 |
Keywords
- Backpropagation learning
- Gradient descent learning
- Multilayer perceptrons
- Natural gradient
- Neuromanifold
- Singularity
- Stochastic neural networks