
Article: On the performance analysis of the least mean M-estimate and normalized least mean M-estimate algorithms with Gaussian inputs and additive Gaussian and contaminated Gaussian noises

Title: On the performance analysis of the least mean M-estimate and normalized least mean M-estimate algorithms with Gaussian inputs and additive Gaussian and contaminated Gaussian noises
Authors: Chan, SC; Zhou, Y
Keywords: Adaptive filtering
Impulsive noise
Least mean square/M-estimate
Robust statistics
Issue Date: 2010
Publisher: Springer New York LLC. The Journal's web site is located at http://springerlink.metapress.com/content/120889/
Citation: Journal of Signal Processing Systems, 2010, v. 60, n. 1, p. 81-103
Abstract: This paper studies the convergence of the least mean M-estimate (LMM) and normalized least mean M-estimate (NLMM) algorithms with Gaussian inputs and additive Gaussian and contaminated Gaussian noises. These algorithms are based on the M-estimate cost function and employ an error nonlinearity to achieve improved robustness in impulsive noise environments over their conventional LMS and NLMS counterparts. Using Price's theorem and an extension of the method proposed in Bershad (IEEE Transactions on Acoustics, Speech, and Signal Processing, ASSP-34(4), 793-806, 1986; IEEE Transactions on Acoustics, Speech, and Signal Processing, 35(5), 636-644, 1987), we first derive new expressions for the decoupled difference equations that describe the mean and mean square convergence behaviors of these algorithms for Gaussian inputs and additive Gaussian noise. These new expressions, written in terms of generalized Abelian integral functions, closely resemble those for the LMS algorithm and allow us to interpret the convergence performance and determine the step size stability bound of the studied algorithms. Next, using an extension of Price's theorem for Gaussian mixtures, similar results are obtained for the additive contaminated Gaussian noise case. The theoretical analysis and the practical advantages of the LMM/NLMM algorithms are verified through computer simulations. © 2009 Springer Science+Business Media, LLC.
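The core idea the abstract describes can be sketched in a few lines: an LMS-style update in which the error is passed through an M-estimate weighting function, so that large (likely impulsive) errors contribute little or nothing to the weight update. The hard-clipping weight function, filter order, step size `mu`, and threshold `xi` below are illustrative assumptions for a minimal sketch, not the paper's exact formulation:

```python
import numpy as np

def huber_weight(e, xi):
    # M-estimate weighting q(e) = psi(e)/e. Errors within the
    # threshold xi receive full weight; larger (likely impulsive)
    # errors are clipped to zero weight. This is a simplified
    # hard-clipping variant; the paper's score function may differ.
    return 1.0 if abs(e) <= xi else 0.0

def lmm_filter(x, d, order=4, mu=0.05, xi=1.0):
    """Sketch of a least mean M-estimate (LMM) adaptive filter.

    x: input signal, d: desired (noisy) signal.
    Update: w += mu * q(e) * e * u, i.e. LMS with the error
    scaled by an M-estimate weight q(e).
    Returns the final weight estimate.
    """
    w = np.zeros(order)
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]   # regressor, most recent sample first
        e = d[n] - w @ u                   # a priori estimation error
        w = w + mu * huber_weight(e, xi) * e * u
    return w
```

With the weight fixed at 1 the update reduces to plain LMS; the clipping is what keeps occasional impulses in `d` from derailing the weight estimate, which is the robustness advantage the paper analyzes.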
Persistent Identifier: http://hdl.handle.net/10722/124017
ISSN: 1939-8018 (print); 1939-8115 (electronic)
2015 Impact Factor: 0.508
2015 SCImago Journal Rankings: 0.262
ISI Accession Number ID: WOS:000276722700007

 

DC Field | Value | Language
dc.contributor.author | Chan, SC | en_HK
dc.contributor.author | Zhou, Y | en_HK
dc.date.accessioned | 2010-10-19T04:33:28Z | -
dc.date.available | 2010-10-19T04:33:28Z | -
dc.date.issued | 2010 | en_HK
dc.identifier.citation | Journal of Signal Processing Systems, 2010, v. 60, n. 1, p. 81-103 | en_HK
dc.identifier.issn | 1939-8018 | en_HK
dc.identifier.uri | http://hdl.handle.net/10722/124017 | -
dc.description.abstract | This paper studies the convergence of the least mean M-estimate (LMM) and normalized least mean M-estimate (NLMM) algorithms with Gaussian inputs and additive Gaussian and contaminated Gaussian noises. These algorithms are based on the M-estimate cost function and employ an error nonlinearity to achieve improved robustness in impulsive noise environments over their conventional LMS and NLMS counterparts. Using Price's theorem and an extension of the method proposed in Bershad (IEEE Transactions on Acoustics, Speech, and Signal Processing, ASSP-34(4), 793-806, 1986; IEEE Transactions on Acoustics, Speech, and Signal Processing, 35(5), 636-644, 1987), we first derive new expressions for the decoupled difference equations that describe the mean and mean square convergence behaviors of these algorithms for Gaussian inputs and additive Gaussian noise. These new expressions, written in terms of generalized Abelian integral functions, closely resemble those for the LMS algorithm and allow us to interpret the convergence performance and determine the step size stability bound of the studied algorithms. Next, using an extension of Price's theorem for Gaussian mixtures, similar results are obtained for the additive contaminated Gaussian noise case. The theoretical analysis and the practical advantages of the LMM/NLMM algorithms are verified through computer simulations. © 2009 Springer Science+Business Media, LLC. | en_HK
dc.language | eng | en_HK
dc.publisher | Springer New York LLC. The Journal's web site is located at http://springerlink.metapress.com/content/120889/ | en_HK
dc.relation.ispartof | Journal of Signal Processing Systems | en_HK
dc.rights | Springer Science+Business Media, LLC | en_HK
dc.rights | Creative Commons: Attribution 3.0 Hong Kong License | -
dc.subject | Adaptive filtering | en_HK
dc.subject | Impulsive noise | en_HK
dc.subject | Least mean square/M-estimate | en_HK
dc.subject | Robust statistics | en_HK
dc.title | On the performance analysis of the least mean M-estimate and normalized least mean M-estimate algorithms with Gaussian inputs and additive Gaussian and contaminated Gaussian noises | en_HK
dc.type | Article | en_HK
dc.identifier.email | Chan, SC: ascchan@hkucc.hku.hk | en_HK
dc.identifier.email | Zhou, Y: yizhou@eee.hku.hk | en_HK
dc.identifier.authority | Chan, SC=rp00094 | en_HK
dc.identifier.authority | Zhou, Y=rp00213 | en_HK
dc.description.nature | published_or_final_version | -
dc.identifier.doi | 10.1007/s11265-009-0405-9 | en_HK
dc.identifier.scopus | eid_2-s2.0-77951257909 | en_HK
dc.identifier.hkuros | 183148 | -
dc.relation.references | http://www.scopus.com/mlt/select.url?eid=2-s2.0-77951257909&selection=ref&src=s&origin=recordpage | en_HK
dc.identifier.volume | 60 | en_HK
dc.identifier.issue | 1 | en_HK
dc.identifier.spage | 81 | en_HK
dc.identifier.epage | 103 | en_HK
dc.identifier.eissn | 1939-8115 | en_HK
dc.identifier.isi | WOS:000276722700007 | -
dc.publisher.place | United States | en_HK
dc.description.other | Springer Open Choice, 01 Dec 2010 | -
dc.identifier.scopusauthorid | Chan, SC=13310287100 | en_HK
dc.identifier.scopusauthorid | Zhou, Y=55209555200 | en_HK
dc.identifier.citeulike | 6045027 | -
