Solving local minima problem with large number of hidden nodes on two-layered feed-forward artificial neural networks

Bumghi Choi, Ju Hong Lee, Deok Hwan Kim

Research output: Contribution to journal › Article › peer-review

63 Scopus citations

Abstract

Gradient descent algorithms such as backpropagation (BP) and its variations on multi-layered feed-forward networks are widely used in many applications. However, the most serious problem associated with BP is the local minima problem. In particular, an excessive number of hidden nodes deepens the local minima problem of the corresponding network. We propose an algorithm that shows stable training performance despite a large number of hidden nodes. In this algorithm, called the separate learning algorithm, the hidden-to-output and input-to-hidden weights are trained separately. Simulations on several benchmark problems demonstrate the validity of the proposed method.
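The sketch below is only a rough illustration of the general idea of separate, phase-wise training of the two weight layers, not the authors' algorithm: the paper derives explicit target values for the hidden layer (see the Keywords), whereas this sketch simply alternates an output-layer update with a frozen-output-layer backpropagation step. All function names, the phase structure, and the hyperparameters are assumptions introduced for illustration.

import numpy as np

# Hypothetical sketch: a two-layer (one hidden layer) feed-forward network in
# which the hidden-to-output weights and the input-to-hidden weights are
# updated in separate phases rather than by one joint backpropagation pass.
# This is NOT the paper's exact method; it is an illustrative assumption.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_separately(X, y, n_hidden=20, epochs=5000, lr=0.5, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.1, size=(X.shape[1], n_hidden))  # input-to-hidden
    W2 = rng.normal(scale=0.1, size=(n_hidden, 1))           # hidden-to-output

    for _ in range(epochs):
        # Phase 1: update the hidden-to-output weights with the hidden layer fixed.
        H = sigmoid(X @ W1)
        out = sigmoid(H @ W2)
        delta_out = (out - y) * out * (1 - out)
        W2 -= lr * H.T @ delta_out

        # Phase 2: update the input-to-hidden weights, here by backpropagating
        # the output error through the now-frozen hidden-to-output weights.
        H = sigmoid(X @ W1)
        out = sigmoid(H @ W2)
        delta_out = (out - y) * out * (1 - out)
        delta_hidden = (delta_out @ W2.T) * H * (1 - H)
        W1 -= lr * X.T @ delta_hidden

    return W1, W2

# Toy usage on XOR, a common benchmark in local-minima studies.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train_separately(X, y)
print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 2))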

Original language: English
Pages (from-to): 3640-3643
Number of pages: 4
Journal: Neurocomputing
Volume: 71
Issue number: 16-18
DOIs
State: Published - Oct 2008

Keywords

  • Backpropagation
  • Hidden nodes
  • Local minima
  • Separate learning
  • Target values
