Article: A new optimization algorithm for single hidden layer feedforward neural networks
Title | A new optimization algorithm for single hidden layer feedforward neural networks |
---|---|
Authors | Li, LK; Shao, S; Yiu, KFC |
Keywords | Evolutionary Algorithm; Feedforward Neural Networks; Training Of Neural Networks |
Issue Date | 2013 |
Publisher | Elsevier BV. The Journal's web site is located at http://www.elsevier.com/locate/asoc |
Citation | Applied Soft Computing Journal, 2013, v. 13 n. 5, p. 2857-2862 |
Abstract | Feedforward neural networks are the most commonly used function approximation technique in neural networks. By the universal approximation theorem, a single-hidden-layer feedforward neural network (FNN) is sufficient to approximate the desired outputs arbitrarily closely. Some researchers use genetic algorithms (GAs) to search for the globally optimal solution of the FNN structure; however, training an FNN with a GA is rather time consuming. In this paper, we propose a new optimization algorithm for a single-hidden-layer FNN. The method is based on a convex combination algorithm for massaging information in the hidden layer. In effect, this technique explores a continuum idea that combines the classic mutation and crossover strategies of GAs. The proposed method has an advantage over GAs, which require a great deal of preprocessing work to break the data down into sequences of binary codes before learning or mutation can be applied. We also set up a new error function to measure the performance of the FNN and to obtain the optimal choice of connection weights, so that the nonlinear optimization problem can be solved directly. Several computational experiments illustrate that the proposed algorithm has good exploration and exploitation capabilities in searching for the optimal weights of single-hidden-layer FNNs. © 2012. |
Persistent Identifier | http://hdl.handle.net/10722/155965 |
ISSN | 1568-4946 (2023 Impact Factor: 7.2; 2023 SCImago Journal Rankings: 1.843) |
ISI Accession Number ID | WOS:000319205200055 |
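The abstract above describes the approach only at a high level, and this record carries no implementation details. The following is a minimal, hypothetical sketch of the general idea as the abstract states it: evolving the parameters of a single-hidden-layer FNN through convex combinations of candidate solutions, so that the continuous blend replaces the crossover step of a binary-coded GA. The network shape, population size, blend rule, and noise term are all illustrative assumptions, not the authors' algorithm.

```python
# Hypothetical sketch, NOT the paper's algorithm: convex combinations of
# candidate parameter sets for a single-hidden-layer FNN, in place of
# binary-coded GA crossover/mutation.
import numpy as np

rng = np.random.default_rng(0)

def fnn(X, W, b, v):
    # Single-hidden-layer network: output = v . tanh(W x + b)
    return np.tanh(X @ W.T + b) @ v

def mse(params, X, y):
    W, b, v = params
    return float(np.mean((fnn(X, W, b, v) - y) ** 2))

# Toy task: approximate y = sin(x) on [-pi, pi].
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
y = np.sin(X).ravel()

H = 8  # hidden units (arbitrary choice)
pop = [(rng.normal(size=(H, 1)), rng.normal(size=H), rng.normal(size=H))
       for _ in range(20)]

err0 = min(mse(p, X, y) for p in pop)  # best initial error

for _ in range(200):
    pop.sort(key=lambda p: mse(p, X, y))
    best, second = pop[0], pop[1]
    # A convex combination of the two best candidates replaces the worst.
    # The continuous blend plays the role of crossover; the small noise
    # term plays the role of mutation -- no binary encoding is needed.
    lam = rng.uniform()
    pop[-1] = tuple(lam * a + (1 - lam) * c + 0.01 * rng.normal(size=a.shape)
                    for a, c in zip(best, second))

err1 = min(mse(p, X, y) for p in pop)  # best final error
print(err0, err1)
```

Because the incumbent best candidate is never replaced, the best error is monotonically non-increasing, which mirrors the "exploration and exploitation" claim in the abstract; everything else here is a stand-in for details that only the full paper provides.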
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Li, LK | en_US |
dc.contributor.author | Shao, S | en_US |
dc.contributor.author | Yiu, KFC | en_US |
dc.date.accessioned | 2012-08-08T08:38:39Z | - |
dc.date.available | 2012-08-08T08:38:39Z | - |
dc.date.issued | 2013 | en_US |
dc.identifier.citation | Applied Soft Computing Journal, 2013, v. 13 n. 5, p. 2857-2862 | en_US |
dc.identifier.issn | 1568-4946 | en_US |
dc.identifier.uri | http://hdl.handle.net/10722/155965 | - |
dc.description.abstract | Feedforward neural networks are the most commonly used function approximation technique in neural networks. By the universal approximation theorem, a single-hidden-layer feedforward neural network (FNN) is sufficient to approximate the desired outputs arbitrarily closely. Some researchers use genetic algorithms (GAs) to search for the globally optimal solution of the FNN structure; however, training an FNN with a GA is rather time consuming. In this paper, we propose a new optimization algorithm for a single-hidden-layer FNN. The method is based on a convex combination algorithm for massaging information in the hidden layer. In effect, this technique explores a continuum idea that combines the classic mutation and crossover strategies of GAs. The proposed method has an advantage over GAs, which require a great deal of preprocessing work to break the data down into sequences of binary codes before learning or mutation can be applied. We also set up a new error function to measure the performance of the FNN and to obtain the optimal choice of connection weights, so that the nonlinear optimization problem can be solved directly. Several computational experiments illustrate that the proposed algorithm has good exploration and exploitation capabilities in searching for the optimal weights of single-hidden-layer FNNs. © 2012. | en_US |
dc.language | eng | en_US |
dc.publisher | Elsevier BV. The Journal's web site is located at http://www.elsevier.com/locate/asoc | en_US |
dc.relation.ispartof | Applied Soft Computing Journal | en_US |
dc.subject | Evolutionary Algorithm | en_US |
dc.subject | Feedforward Neural Networks | en_US |
dc.subject | Training Of Neural Networks | en_US |
dc.title | A new optimization algorithm for single hidden layer feedforward neural networks | en_US |
dc.type | Article | en_US |
dc.identifier.email | Yiu, KFC:cedric@hkucc.hku.hk | en_US |
dc.identifier.authority | Yiu, KFC=rp00206 | en_US |
dc.description.nature | link_to_subscribed_fulltext | en_US |
dc.identifier.doi | 10.1016/j.asoc.2012.04.034 | en_US |
dc.identifier.scopus | eid_2-s2.0-84885640902 | en_US |
dc.identifier.isi | WOS:000319205200055 | - |
dc.publisher.place | Netherlands | en_US |
dc.identifier.scopusauthorid | Li, LK=7501445268 | en_US |
dc.identifier.scopusauthorid | Shao, S=7102636557 | en_US |
dc.identifier.scopusauthorid | Yiu, KFC=24802813000 | en_US |
dc.identifier.citeulike | 10719488 | - |
dc.identifier.issnl | 1568-4946 | - |