Article: A computation-efficient on-line training algorithm for neurofuzzy networks
Title | A computation-efficient on-line training algorithm for neurofuzzy networks |
---|---|
Authors | Chan, CW; Cheung, KC; Yeung, WK |
Issue Date | 2000 |
Publisher | Taylor & Francis Ltd. The Journal's web site is located at http://www.tandf.co.uk/journals/titles/00207721.asp |
Citation | International Journal of Systems Science, 2000, v. 31 n. 3, p. 297-306 |
Abstract | Neurofuzzy networks are often used to model linear or nonlinear processes, as they can provide some insights into the underlying processes and can be trained using experimental data. As the training of the networks involves intensive computation, it is often performed off line. However, it is well known that neurofuzzy networks trained off line may not be able to cope successfully with time-varying processes. To overcome this problem, the weights of the networks are trained on line. In this paper, an on-line training algorithm with a computation time that is linear in the number of weights is derived by making full use of the local change property of neurofuzzy networks. It is shown that the estimated weights converge to those obtained from the least-squares method, and that the range of the input domain can be extended without retraining the network. Furthermore, it tracks time-varying systems better than the recursive least-squares method, since in the proposed algorithm a positive definite submatrix is added to the relevant part of the covariance matrix. The performance of the proposed algorithm is illustrated by simulation examples and compared with that obtained using the recursive least-squares method. (An illustrative sketch of this update appears after this table.) |
Persistent Identifier | http://hdl.handle.net/10722/156553 |
ISSN | 0020-7721 (2023 Impact Factor: 4.9; 2023 SCImago Journal Rankings: 1.851) |
ISI Accession Number ID | WOS:000086140600003 |
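The abstract describes a per-sample update whose cost is linear in the number of weights, because only the basis functions covering the current input are non-zero (the local change property), plus a tracking mechanism that adds a positive definite submatrix to the relevant part of the covariance matrix. The paper's exact algorithm is not reproduced on this page, so the following is only a minimal sketch of those two ideas under stated assumptions: a recursive least-squares-style update restricted to the currently active weights, with sparse activations assumed to come from compact-support membership functions such as B-splines. The class name `LocalizedRLS` and all parameter names are hypothetical.

```python
import numpy as np

class LocalizedRLS:
    """Sketch: on-line training that updates only the weights whose
    basis functions are non-zero for the current input (the 'local
    change' property of neurofuzzy networks)."""

    def __init__(self, n_weights, forgetting=1.0, p0=1e3, q=1e-2):
        self.w = np.zeros(n_weights)        # network weights
        self.P = np.eye(n_weights) * p0     # covariance estimate
        self.lam = forgetting               # forgetting factor
        self.q = q                          # size of the added positive
                                            # definite term (hypothetical)

    def update(self, basis, y):
        """basis: activation vector a(x), sparse by construction;
        y: measured output. Returns the a priori prediction error."""
        idx = np.flatnonzero(basis)         # indices of active weights
        a = basis[idx]                      # local regressor
        P = self.P[np.ix_(idx, idx)]        # local covariance block
        # Add a positive definite submatrix to the relevant part of the
        # covariance, as the abstract describes, so the gain for
        # re-visited input regions does not decay to zero (tracking).
        P = P + self.q * np.eye(idx.size)
        k = P @ a / (self.lam + a @ P @ a)  # local RLS-style gain
        err = y - a @ self.w[idx]           # a priori prediction error
        self.w[idx] += k * err              # update active weights only
        self.P[np.ix_(idx, idx)] = (P - np.outer(k, a) @ P) / self.lam
        return err
```

With a fixed number of basis functions active per input, each update touches a constant-size covariance block, so total work scales linearly with the number of weights, whereas a full recursive least-squares update over all weights costs quadratic time per sample.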
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chan, CW | en_HK |
dc.contributor.author | Cheung, KC | en_HK |
dc.contributor.author | Yeung, WK | en_HK |
dc.date.accessioned | 2012-08-08T08:42:56Z | - |
dc.date.available | 2012-08-08T08:42:56Z | - |
dc.date.issued | 2000 | en_HK |
dc.identifier.citation | International Journal of Systems Science, 2000, v. 31 n. 3, p. 297-306 | en_HK
dc.identifier.issn | 0020-7721 | en_HK |
dc.identifier.uri | http://hdl.handle.net/10722/156553 | - |
dc.description.abstract | Neurofuzzy networks are often used to model linear or nonlinear processes, as they can provide some insights into the underlying processes and can be trained using experimental data. As the training of the networks involves intensive computation, it is often performed off line. However, it is well known that neurofuzzy networks trained off line may not be able to cope successfully with time-varying processes. To overcome this problem, the weights of the networks are trained on line. In this paper, an on-line training algorithm with a computation time that is linear in the number of weights is derived by making full use of the local change property of neurofuzzy networks. It is shown that the estimated weights converge to those obtained from the least-squares method, and that the range of the input domain can be extended without retraining the network. Furthermore, it tracks time-varying systems better than the recursive least-squares method, since in the proposed algorithm a positive definite submatrix is added to the relevant part of the covariance matrix. The performance of the proposed algorithm is illustrated by simulation examples and compared with that obtained using the recursive least-squares method. | en_HK
dc.language | eng | en_US |
dc.publisher | Taylor & Francis Ltd. The Journal's web site is located at http://www.tandf.co.uk/journals/titles/00207721.asp | en_HK |
dc.relation.ispartof | International Journal of Systems Science | en_HK |
dc.title | A computation-efficient on-line training algorithm for neurofuzzy networks | en_HK |
dc.type | Article | en_HK |
dc.identifier.email | Chan, CW: mechan@hkucc.hku.hk | en_HK |
dc.identifier.email | Cheung, KC: kccheung@hkucc.hku.hk | en_HK |
dc.identifier.authority | Chan, CW=rp00088 | en_HK |
dc.identifier.authority | Cheung, KC=rp01322 | en_HK |
dc.description.nature | link_to_subscribed_fulltext | en_US |
dc.identifier.doi | 10.1080/002077200291145 | en_HK |
dc.identifier.scopus | eid_2-s2.0-0034161023 | en_HK |
dc.identifier.hkuros | 49552 | - |
dc.relation.references | http://www.scopus.com/mlt/select.url?eid=2-s2.0-0034161023&selection=ref&src=s&origin=recordpage | en_HK |
dc.identifier.volume | 31 | en_HK |
dc.identifier.issue | 3 | en_HK |
dc.identifier.spage | 297 | en_HK |
dc.identifier.epage | 306 | en_HK |
dc.identifier.isi | WOS:000086140600003 | - |
dc.publisher.place | United Kingdom | en_HK |
dc.identifier.scopusauthorid | Chan, CW=7404814060 | en_HK |
dc.identifier.scopusauthorid | Cheung, KC=7402406698 | en_HK |
dc.identifier.scopusauthorid | Yeung, WK=24345897100 | en_HK |
dc.identifier.issnl | 0020-7721 | - |