Low complexity adaptive forgetting factor for online sequential extreme learning machine (OS-ELM) for application to nonstationary system estimations

Jun seok Lim, Seokjin Lee, Hee Suk Pang

Research output: Contribution to journal › Article › peer-review


Abstract

Huang et al. (2004) proposed an online sequential ELM (OS-ELM) that enables the extreme learning machine (ELM) to learn data one-by-one as well as chunk-by-chunk. OS-ELM is based on a recursive least-squares-type algorithm that uses a constant forgetting factor. In OS-ELM, the parameters of the hidden nodes are randomly selected, and the output weights are determined from the sequentially arriving data. However, OS-ELM with a constant forgetting factor cannot provide satisfactory performance in time-varying or nonstationary environments. We therefore propose an OS-ELM algorithm with an adaptive forgetting factor that maintains good performance in such environments. The proposed algorithm has the following advantages: (1) the adaptive forgetting factor requires only O(N) additional complexity, where N is the number of hidden neurons, and (2) its performance is comparable to that of the conventional OS-ELM with an optimally chosen constant forgetting factor.
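To make the recursive structure described above concrete, the following is a minimal Python sketch of an OS-ELM-style update with a forgetting-factor recursive least-squares step. It is illustrative only: the class name, parameters, and initialization are assumptions, a constant forgetting factor `lam` is used, and the paper's O(N) adaptive rule for adjusting the forgetting factor is not reproduced because its exact form is not given in the abstract.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ForgettingOSELM:
    """Sketch of OS-ELM with a forgetting-factor RLS update (not the paper's exact method).

    Hidden-node parameters are drawn at random; only the output weights
    `beta` are updated as samples arrive, via a recursive least-squares
    step weighted by the forgetting factor `lam`.
    """

    def __init__(self, n_inputs, n_hidden, n_outputs, lam=0.99, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((n_inputs, n_hidden))  # random input weights
        self.b = rng.standard_normal(n_hidden)               # random hidden biases
        self.beta = np.zeros((n_hidden, n_outputs))          # output weights
        self.P = np.eye(n_hidden) * 1e4                      # inverse correlation matrix
        self.lam = lam                                       # constant forgetting factor

    def _hidden(self, x):
        # Hidden-layer output for a single input row (1 x n_hidden).
        return sigmoid(x @ self.W + self.b)

    def partial_fit(self, x, t):
        """One-by-one sequential update for a single sample (x, t)."""
        h = self._hidden(x.reshape(1, -1))
        Ph = self.P @ h.T
        k = Ph / (self.lam + h @ Ph)               # gain vector
        err = t.reshape(1, -1) - h @ self.beta     # a priori prediction error
        self.beta += k @ err
        self.P = (self.P - k @ h @ self.P) / self.lam
        # The paper's contribution is an adaptive rule that re-tunes `lam`
        # from the error signal at O(N) extra cost per sample; a constant
        # value is used here because the rule is not specified in the abstract.

    def predict(self, x):
        return self._hidden(np.atleast_2d(x)) @ self.beta
```

In a nonstationary setting, the role of the adaptive rule is to shrink the effective memory (smaller `lam`) when the prediction error grows and to lengthen it (larger `lam`) when the system is locally stationary, which a fixed `lam` cannot do.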

Original language: English
Pages (from-to): 569-576
Number of pages: 8
Journal: Neural Computing and Applications
Volume: 22
Issue number: 3-4
DOIs
State: Published - Mar 2013

Keywords

  • Extreme learning machine
  • OS-ELM
  • RLS adaptive forgetting factor
