Article: ML estimation for factor analysis: EM or non-EM?
Title: ML estimation for factor analysis: EM or non-EM?

Authors: Zhao, JH (1,3); Yu, PLH (1); Jiang, Q (2)

Keywords: CM; ECME; EM; Factor analysis; Maximum likelihood estimation

Issue Date: 2008

Publisher: Springer New York LLC. The Journal's web site is located at http://springerlink.metapress.com/openurl.asp?genre=journal&issn=0960-3174

Citation: Statistics And Computing, 2008, v. 18 n. 2, p. 109-123. DOI: http://dx.doi.org/10.1007/s11222-007-9042-y
 
Abstract: To obtain maximum likelihood (ML) estimates in factor analysis (FA), we propose a novel and fast conditional maximization (CM) algorithm with quadratic, monotone convergence, consisting of a sequence of CM log-likelihood (CML) steps. The main contribution of this algorithm is that a closed-form expression for the parameter updated in each step can be obtained explicitly, without resorting to numerical optimization. In addition, a new ECME algorithm similar to that of Liu (Biometrika 81, 633-648, 1994) is obtained as a by-product; it turns out to be very close to the simple iteration algorithm proposed by Lawley (Proc. R. Soc. Edinb. 60, 64-82, 1940), but unlike Lawley's, our algorithm is guaranteed to increase the log-likelihood at every iteration and hence to converge. Both algorithms inherit the simplicity and stability of EM, but their convergence behavior differs markedly, as our extensive simulations reveal: (1) in most situations, ECME and EM perform similarly; (2) CM outperforms EM and ECME substantially in all situations, whether assessed by CPU time or by the number of iterations. In particular, for cases close to the well-known Heywood case, it accelerates EM by factors of around 100 or more. CM is also far less sensitive to the choice of starting values than EM and ECME. © 2007 Springer Science+Business Media, LLC.
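For readers unfamiliar with the EM baseline the abstract compares against, the following is a minimal sketch of plain EM for the factor analysis model x = Λz + ε, with z ~ N(0, I) and ε ~ N(0, Ψ), Ψ diagonal. It is a generic textbook EM, not the paper's CM or ECME algorithms, and all function and variable names are ours:

```python
import numpy as np

def em_factor_analysis(X, q, n_iter=200, tol=1e-8, seed=0):
    """Plain EM for the FA model x = Lambda z + eps, z ~ N(0, I_q),
    eps ~ N(0, Psi) with Psi diagonal; data are mean-centered first."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    X = X - X.mean(axis=0)
    S = X.T @ X / n                              # sample covariance
    Lam = rng.standard_normal((p, q)) * 0.1      # random loading init
    Psi = np.diag(S).copy()                      # init uniquenesses
    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: posterior moments of the latent factors
        Sigma = Lam @ Lam.T + np.diag(Psi)       # model covariance
        beta = np.linalg.solve(Sigma, Lam).T     # q x p: Lam' Sigma^{-1}
        Ezz = np.eye(q) - beta @ Lam + beta @ S @ beta.T
        # M-step: closed-form updates for loadings and uniquenesses
        Lam = S @ beta.T @ np.linalg.inv(Ezz)
        Psi = np.diag(S - Lam @ beta @ S)
        # observed-data log-likelihood (up to constants), for monitoring
        _, logdet = np.linalg.slogdet(Sigma)
        ll = -0.5 * n * (logdet + np.trace(np.linalg.solve(Sigma, S)))
        if abs(ll - ll_old) < tol * abs(ll_old):
            break
        ll_old = ll
    return Lam, Psi, ll
```

Each iteration updates Λ and Ψ jointly from the E-step moments; as the abstract notes, near a Heywood case (a uniqueness driven toward zero) this iteration can become extremely slow, which is the regime where the paper reports CM's largest speedups.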
 
ISSN: 0960-3174
2012 Impact Factor: 1.977
2012 SCImago Journal Rankings: 1.807

DOI: http://dx.doi.org/10.1007/s11222-007-9042-y

References: References in Scopus
 
DC Field: Value
dc.contributor.author: Zhao, JH
dc.contributor.author: Yu, PLH
dc.contributor.author: Jiang, Q
dc.date.accessioned: 2010-09-06T08:33:45Z
dc.date.available: 2010-09-06T08:33:45Z
dc.date.issued: 2008
 
dc.description.abstract: To obtain maximum likelihood (ML) estimates in factor analysis (FA), we propose a novel and fast conditional maximization (CM) algorithm with quadratic, monotone convergence, consisting of a sequence of CM log-likelihood (CML) steps. The main contribution of this algorithm is that a closed-form expression for the parameter updated in each step can be obtained explicitly, without resorting to numerical optimization. In addition, a new ECME algorithm similar to that of Liu (Biometrika 81, 633-648, 1994) is obtained as a by-product; it turns out to be very close to the simple iteration algorithm proposed by Lawley (Proc. R. Soc. Edinb. 60, 64-82, 1940), but unlike Lawley's, our algorithm is guaranteed to increase the log-likelihood at every iteration and hence to converge. Both algorithms inherit the simplicity and stability of EM, but their convergence behavior differs markedly, as our extensive simulations reveal: (1) in most situations, ECME and EM perform similarly; (2) CM outperforms EM and ECME substantially in all situations, whether assessed by CPU time or by the number of iterations. In particular, for cases close to the well-known Heywood case, it accelerates EM by factors of around 100 or more. CM is also far less sensitive to the choice of starting values than EM and ECME. © 2007 Springer Science+Business Media, LLC.
 
dc.description.nature: Link_to_subscribed_fulltext
dc.identifier.citation: Statistics And Computing, 2008, v. 18 n. 2, p. 109-123. DOI: http://dx.doi.org/10.1007/s11222-007-9042-y
dc.identifier.citeulike: 10318479
dc.identifier.doi: http://dx.doi.org/10.1007/s11222-007-9042-y
dc.identifier.eissn: 1573-1375
dc.identifier.epage: 123
dc.identifier.hkuros: 145601
dc.identifier.issn: 0960-3174
dc.identifier.issue: 2
dc.identifier.openurl:
dc.identifier.scopus: eid_2-s2.0-41549124138
dc.identifier.spage: 109
dc.identifier.uri: http://hdl.handle.net/10722/82818
dc.language: eng
dc.publisher: Springer New York LLC. The Journal's web site is located at http://springerlink.metapress.com/openurl.asp?genre=journal&issn=0960-3174
dc.publisher.place: United States
dc.relation.ispartof: Statistics and Computing
dc.relation.references: References in Scopus
dc.subject: CM
dc.subject: ECME
dc.subject: EM
dc.subject: Factor analysis
dc.subject: Maximum likelihood estimation
dc.title: ML estimation for factor analysis: EM or non-EM?
dc.type: Article
 
<?xml version="1.0" encoding="utf-8"?>
<item><contributor.author>Zhao, JH</contributor.author>
<contributor.author>Yu, PLH</contributor.author>
<contributor.author>Jiang, Q</contributor.author>
<date.accessioned>2010-09-06T08:33:45Z</date.accessioned>
<date.available>2010-09-06T08:33:45Z</date.available>
<date.issued>2008</date.issued>
<identifier.citation>Statistics And Computing, 2008, v. 18 n. 2, p. 109-123</identifier.citation>
<identifier.issn>0960-3174</identifier.issn>
<identifier.uri>http://hdl.handle.net/10722/82818</identifier.uri>
<description.abstract>To obtain maximum likelihood (ML) estimates in factor analysis (FA), we propose a novel and fast conditional maximization (CM) algorithm with quadratic, monotone convergence, consisting of a sequence of CM log-likelihood (CML) steps. The main contribution of this algorithm is that a closed-form expression for the parameter updated in each step can be obtained explicitly, without resorting to numerical optimization. In addition, a new ECME algorithm similar to that of Liu (Biometrika 81, 633-648, 1994) is obtained as a by-product; it turns out to be very close to the simple iteration algorithm proposed by Lawley (Proc. R. Soc. Edinb. 60, 64-82, 1940), but unlike Lawley&apos;s, our algorithm is guaranteed to increase the log-likelihood at every iteration and hence to converge. Both algorithms inherit the simplicity and stability of EM, but their convergence behavior differs markedly, as our extensive simulations reveal: (1) in most situations, ECME and EM perform similarly; (2) CM outperforms EM and ECME substantially in all situations, whether assessed by CPU time or by the number of iterations. In particular, for cases close to the well-known Heywood case, it accelerates EM by factors of around 100 or more. CM is also far less sensitive to the choice of starting values than EM and ECME. &#169; 2007 Springer Science+Business Media, LLC.</description.abstract>
<language>eng</language>
<publisher>Springer New York LLC. The Journal&apos;s web site is located at http://springerlink.metapress.com/openurl.asp?genre=journal&amp;issn=0960-3174</publisher>
<relation.ispartof>Statistics and Computing</relation.ispartof>
<subject>CM</subject>
<subject>ECME</subject>
<subject>EM</subject>
<subject>Factor analysis</subject>
<subject>Maximum likelihood estimation</subject>
<title>ML estimation for factor analysis: EM or non-EM?</title>
<type>Article</type>
<identifier.openurl>http://library.hku.hk:4550/resserv?sid=HKU:IR&amp;issn=0960-3174&amp;volume=18&amp;spage=109&amp;epage=123&amp;date=2008&amp;atitle=ML+estimation+for+factor+analysis:+EM+or+non-EM?+</identifier.openurl>
<description.nature>Link_to_subscribed_fulltext</description.nature>
<identifier.doi>10.1007/s11222-007-9042-y</identifier.doi>
<identifier.scopus>eid_2-s2.0-41549124138</identifier.scopus>
<identifier.hkuros>145601</identifier.hkuros>
<relation.references>http://www.scopus.com/mlt/select.url?eid=2-s2.0-41549124138&amp;selection=ref&amp;src=s&amp;origin=recordpage</relation.references>
<identifier.volume>18</identifier.volume>
<identifier.issue>2</identifier.issue>
<identifier.spage>109</identifier.spage>
<identifier.epage>123</identifier.epage>
<identifier.eissn>1573-1375</identifier.eissn>
<publisher.place>United States</publisher.place>
<identifier.citeulike>10318479</identifier.citeulike>
</item>
Author Affiliations
  1. The University of Hong Kong
  2. Southeast University
  3. Yunnan University