Abstract
In hierarchical models such as neural networks, complex singular structures exist in the parameter space. These singularities are known to affect the estimation performance and the learning dynamics of the models. Recently there have been a number of studies on the properties of the estimators obtained for such models, but few on the dynamical properties of the learning procedures used to obtain them. Using two-layer neural networks, we investigate the influence of singularities on the dynamics of standard gradient learning and natural gradient learning under various learning conditions. In standard gradient learning we found a quasi-plateau phenomenon, which in some cases is more severe than the well-known plateau. The slow convergence caused by the quasi-plateau and the plateau becomes extremely serious when the optimal point lies in a neighborhood of a singularity. In natural gradient learning, however, neither the quasi-plateau nor the plateau is observed, and the convergence speed is hardly affected by singularities.
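The abstract contrasts standard gradient learning with natural gradient learning on two-layer networks near singular regions. The following minimal sketch is not the paper's experimental setup; it only illustrates the two update rules on a toy two-layer tanh network whose teacher has two nearly redundant hidden units. The network size, learning rate, damping term, teacher function, and the use of the empirical Fisher (Gauss-Newton) matrix are all illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the paper's setup):
# standard vs. natural gradient descent for a two-layer network
# f(x) = sum_i v_i * tanh(w_i * x) on 1-D regression data.
import numpy as np

rng = np.random.default_rng(0)
NUM_HIDDEN = 2          # hypothetical network width
LEARNING_RATE = 0.05    # hypothetical step size
DAMPING = 1e-4          # ridge term keeping the empirical Fisher invertible

def forward(params, x):
    """Two-layer net: params = (w, v), output sum_i v_i * tanh(w_i * x)."""
    w, v = params
    return np.tanh(np.outer(x, w)) @ v

def grad_per_sample(params, x):
    """Per-sample gradient of the output w.r.t. all parameters (w then v)."""
    w, v = params
    h = np.tanh(np.outer(x, w))                  # hidden activations, (n, H)
    dw = (1.0 - h ** 2) * v * x[:, None]         # d f / d w_i
    dv = h                                       # d f / d v_i
    return np.concatenate([dw, dv], axis=1)      # Jacobian rows, (n, 2H)

def step(params, x, y, natural):
    """One update of standard or natural gradient descent on 0.5 * MSE."""
    w, v = params
    err = forward(params, x) - y                 # residuals
    J = grad_per_sample(params, x)
    g = J.T @ err / len(x)                       # plain gradient
    if natural:
        # Empirical Fisher (Gauss-Newton) matrix; precondition the gradient.
        F = J.T @ J / len(x) + DAMPING * np.eye(J.shape[1])
        g = np.linalg.solve(F, g)
    upd = np.concatenate([w, v]) - LEARNING_RATE * g
    return upd[:NUM_HIDDEN], upd[NUM_HIDDEN:]

# Toy teacher near a singular region: two almost identical hidden units.
x = rng.uniform(-2, 2, size=200)
y = 0.5 * np.tanh(1.0 * x) + 0.5 * np.tanh(1.05 * x)

for natural in (False, True):
    params = (rng.normal(size=NUM_HIDDEN), rng.normal(size=NUM_HIDDEN))
    for _ in range(2000):
        params = step(params, x, y, natural)
    loss = 0.5 * np.mean((forward(params, x) - y) ** 2)
    print("natural" if natural else "standard", "final loss:", loss)
```

Under such assumptions, the preconditioning by the (damped) Fisher matrix is what lets natural gradient learning move efficiently even where the plain gradient nearly vanishes, which is the mechanism the abstract associates with avoiding plateaus and quasi-plateaus.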
| Original language | English |
| --- | --- |
| Pages (from-to) | 282-291 |
| Number of pages | 10 |
| Journal | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
| Volume | 3157 |
| DOIs | |
| State | Published - 2004 |
| Event | 8th Pacific Rim International Conference on Artificial Intelligence, PRICAI 2004: Trends in Artificial Intelligence, Auckland, New Zealand. Duration: 9 Aug 2004 → 13 Aug 2004 |