Article: Efficient methods for estimating constrained parameters with applications to regularized (lasso) logistic regression
Title | Efficient methods for estimating constrained parameters with applications to regularized (lasso) logistic regression |
---|---|
Authors | Tian, GL; Tang, ML; Fang, HB; Tan, M |
Issue Date | 2008 |
Publisher | Elsevier BV. The Journal's web site is located at http://www.elsevier.com/locate/csda |
Citation | Computational Statistics And Data Analysis, 2008, v. 52 n. 7, p. 3528-3542 |
Abstract | Fitting logistic regression models is challenging when their parameters are restricted. In this article, we first develop a quadratic lower-bound (QLB) algorithm for optimization with box or linear inequality constraints and derive the fastest QLB algorithm corresponding to the smallest global majorization matrix. The proposed QLB algorithm is particularly suited to problems to which the EM-type algorithms are not applicable (e.g., logistic, multinomial logistic, and Cox's proportional hazards models) while it retains the same EM ascent property and thus assures the monotonic convergence. Secondly, we generalize the QLB algorithm to penalized problems in which the penalty functions may not be totally differentiable. The proposed method thus provides an alternative algorithm for estimation in lasso logistic regression, where the convergence of the existing lasso algorithm is not generally ensured. Finally, by relaxing the ascent requirement, convergence speed can be further accelerated. We introduce a pseudo-Newton method that retains the simplicity of the QLB algorithm and the fast convergence of the Newton method. Theoretical justification and numerical examples show that the pseudo-Newton method is up to 71 (in terms of CPU time) or 107 (in terms of number of iterations) times faster than the fastest QLB algorithm and thus makes bootstrap variance estimation feasible. Simulations and comparisons are performed and three real examples (Down syndrome data, kyphosis data, and colon microarray data) are analyzed to illustrate the proposed methods. © 2007 Elsevier Ltd. All rights reserved. |
Persistent Identifier | http://hdl.handle.net/10722/59856 |
ISSN | 0167-9473 (2023 Impact Factor: 1.5; 2023 SCImago Journal Rankings: 1.008) |
ISI Accession Number ID | WOS:000255145900018 |
References | |
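The abstract above centers on a quadratic lower-bound (QLB) idea: replace the logistic log-likelihood's data-dependent Hessian with a single global majorization matrix, so every iteration solves the same easy quadratic problem while keeping the monotone ascent property. The sketch below is only a minimal illustration of that idea for plain (unconstrained, unpenalized) logistic regression, using the well-known global bound on the negative Hessian by X'X/4; the function name and the simulated data are assumptions for demonstration, and the paper's actual algorithms additionally handle box/linear constraints, lasso penalties, and the pseudo-Newton acceleration.

```python
import numpy as np

def qlb_logistic(X, y, n_iter=200, tol=1e-8):
    """Quadratic lower-bound (MM) iterations for unconstrained logistic
    regression.  Uses the global majorization matrix B = X'X / 4, which
    dominates the negative Hessian because p*(1-p) <= 1/4.  This is a
    hypothetical illustration of the QLB idea only, not the paper's
    constrained or penalized algorithm."""
    n, d = X.shape
    theta = np.zeros(d)
    B = X.T @ X / 4.0                # fixed global majorization matrix
    B_inv = np.linalg.inv(B)         # inverted once, reused every iteration
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ theta))
        step = B_inv @ (X.T @ (y - p))   # ascent step; log-likelihood is monotone
        theta = theta + step
        if np.max(np.abs(step)) < tol:
            break
    return theta

# Tiny usage example on simulated data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_theta = np.array([1.0, -2.0, 0.5])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_theta)))
print(qlb_logistic(X, y))
```

Because the majorization matrix is fixed, each update costs only one matrix-vector solve, which is what makes the bootstrap variance estimation mentioned in the abstract computationally affordable once the iteration is further accelerated.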
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Tian, GL | en_HK |
dc.contributor.author | Tang, ML | en_HK |
dc.contributor.author | Fang, HB | en_HK |
dc.contributor.author | Tan, M | en_HK |
dc.date.accessioned | 2010-05-31T03:58:52Z | - |
dc.date.available | 2010-05-31T03:58:52Z | - |
dc.date.issued | 2008 | en_HK |
dc.identifier.citation | Computational Statistics And Data Analysis, 2008, v. 52 n. 7, p. 3528-3542 | en_HK |
dc.identifier.issn | 0167-9473 | en_HK |
dc.identifier.uri | http://hdl.handle.net/10722/59856 | - |
dc.description.abstract | Fitting logistic regression models is challenging when their parameters are restricted. In this article, we first develop a quadratic lower-bound (QLB) algorithm for optimization with box or linear inequality constraints and derive the fastest QLB algorithm corresponding to the smallest global majorization matrix. The proposed QLB algorithm is particularly suited to problems to which the EM-type algorithms are not applicable (e.g., logistic, multinomial logistic, and Cox's proportional hazards models) while it retains the same EM ascent property and thus assures the monotonic convergence. Secondly, we generalize the QLB algorithm to penalized problems in which the penalty functions may not be totally differentiable. The proposed method thus provides an alternative algorithm for estimation in lasso logistic regression, where the convergence of the existing lasso algorithm is not generally ensured. Finally, by relaxing the ascent requirement, convergence speed can be further accelerated. We introduce a pseudo-Newton method that retains the simplicity of the QLB algorithm and the fast convergence of the Newton method. Theoretical justification and numerical examples show that the pseudo-Newton method is up to 71 (in terms of CPU time) or 107 (in terms of number of iterations) times faster than the fastest QLB algorithm and thus makes bootstrap variance estimation feasible. Simulations and comparisons are performed and three real examples (Down syndrome data, kyphosis data, and colon microarray data) are analyzed to illustrate the proposed methods. © 2007 Elsevier Ltd. All rights reserved. | en_HK |
dc.language | eng | en_HK |
dc.publisher | Elsevier BV. The Journal's web site is located at http://www.elsevier.com/locate/csda | en_HK |
dc.relation.ispartof | Computational Statistics and Data Analysis | en_HK |
dc.title | Efficient methods for estimating constrained parameters with applications to regularized (lasso) logistic regression | en_HK |
dc.type | Article | en_HK |
dc.identifier.email | Tian, GL: gltian@hku.hk | en_HK |
dc.identifier.authority | Tian, GL=rp00789 | en_HK |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1016/j.csda.2007.11.007 | en_HK |
dc.identifier.scopus | eid_2-s2.0-40249087061 | en_HK |
dc.identifier.hkuros | 163559 | en_HK |
dc.relation.references | http://www.scopus.com/mlt/select.url?eid=2-s2.0-40249087061&selection=ref&src=s&origin=recordpage | en_HK |
dc.identifier.volume | 52 | en_HK |
dc.identifier.issue | 7 | en_HK |
dc.identifier.spage | 3528 | en_HK |
dc.identifier.epage | 3542 | en_HK |
dc.identifier.isi | WOS:000255145900018 | - |
dc.publisher.place | Netherlands | en_HK |
dc.identifier.scopusauthorid | Tian, GL=25621549400 | en_HK |
dc.identifier.scopusauthorid | Tang, ML=7401974011 | en_HK |
dc.identifier.scopusauthorid | Fang, HB=7402543028 | en_HK |
dc.identifier.scopusauthorid | Tan, M=7401464906 | en_HK |
dc.identifier.issnl | 0167-9473 | - |