Article: Online sufficient dimension reduction through sliced inverse regression

Title: Online sufficient dimension reduction through sliced inverse regression
Authors: Cai, Zhanrui; Li, Runze; Zhu, Liping
Keywords: Dimension reduction; Gradient descent; Online learning; Perturbation; Singular value decomposition; Sliced inverse regression
Issue Date: 2020
Citation: Journal of Machine Learning Research, 2020, v. 21
Abstract: Sliced inverse regression is an effective paradigm that achieves dimension reduction by replacing high-dimensional covariates with a small number of their linear combinations. It does not impose parametric assumptions on the dependence structure. More importantly, such a reduction of dimension is sufficient in that it causes no loss of information. In this paper, we adapt stationary sliced inverse regression to cope with rapidly changing environments by implementing it in an online fashion. This online learner consists of two steps. In the first step, we construct an online estimate of the kernel matrix; in the second step, we propose two online algorithms, one motivated by the perturbation method and the other by gradient descent optimization, to perform online singular value decomposition. We establish the theoretical properties of this online learner and demonstrate its numerical performance through simulations and real-world applications. All numerical studies confirm that the online learner performs as well as the batch learner.
Persistent Identifier: http://hdl.handle.net/10722/328782
ISSN: 1532-4435
2023 Impact Factor: 4.3
2023 SCImago Journal Rankings: 2.796
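
The abstract above describes a two-step online learner: maintain a running estimate of the SIR kernel matrix as observations stream in, then update the leading singular directions incrementally instead of recomputing a full decomposition. The following Python/NumPy sketch illustrates that structure only; it is not the authors' implementation. It assumes fixed slice boundaries for the response, and it replaces the paper's perturbation and gradient descent online SVD algorithms with a single subspace-iteration update per observation. The class name OnlineSIR and all parameters are hypothetical.

import numpy as np

# Illustrative sketch of a two-step online SIR learner (not the paper's algorithm).
# Assumptions: slice boundaries for y are fixed in advance, and the online SVD step
# is a single subspace-iteration update per observation.
class OnlineSIR:
    def __init__(self, p, d, slice_edges, ridge=1e-6, seed=0):
        self.n = 0
        self.d = d
        self.ridge = ridge
        self.edges = np.asarray(slice_edges)          # fixed slice boundaries for y
        H = len(self.edges) + 1                       # number of slices
        self.mean = np.zeros(p)                       # running mean of x
        self.cov = np.zeros((p, p))                   # running covariance of x
        self.slice_n = np.zeros(H)                    # observation count per slice
        self.slice_mean = np.zeros((H, p))            # running mean of x within each slice
        rng = np.random.default_rng(seed)
        self.B = np.linalg.qr(rng.standard_normal((p, d)))[0]  # current direction estimate

    def partial_fit(self, x, y):
        # Step 1: online update of the moments that define the kernel matrix.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.cov += (np.outer(delta, x - self.mean) - self.cov) / self.n
        h = int(np.searchsorted(self.edges, y))       # which slice y falls into
        self.slice_n[h] += 1
        self.slice_mean[h] += (x - self.slice_mean[h]) / self.slice_n[h]

        # Kernel matrix estimate: proportion-weighted covariance of the slice means.
        props = self.slice_n / self.n
        centered = self.slice_mean - self.mean
        kernel = (centered * props[:, None]).T @ centered

        # Step 2: one incremental update of the leading directions.
        # Orthogonal (subspace) iteration on Sigma^{-1} * kernel tracks the span of the
        # top-d generalized eigenvectors, i.e. the estimated central subspace.
        sigma = self.cov + self.ridge * np.eye(len(self.mean))
        self.B = np.linalg.qr(np.linalg.solve(sigma, kernel @ self.B))[0]

    def directions(self):
        # Orthonormal basis of the current estimate of the central subspace.
        return self.B

# Toy usage: y depends on x only through x[0] + x[1].
rng = np.random.default_rng(1)
sir = OnlineSIR(p=10, d=1, slice_edges=np.linspace(-3.0, 3.0, 9))
for _ in range(20000):
    x = rng.standard_normal(10)
    y = x[0] + x[1] + 0.5 * rng.standard_normal()
    sir.partial_fit(x, y)
print(sir.directions()[:, 0])  # expected to load mainly on the first two coordinates

Under these assumptions, each observation costs only a constant-size update of the running moments plus one p-by-p solve, and no past data need to be stored, which is the practical appeal of the online formulation described in the abstract.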

 

DC Field: Value
dc.contributor.author: Cai, Zhanrui
dc.contributor.author: Li, Runze
dc.contributor.author: Zhu, Liping
dc.date.accessioned: 2023-07-22T06:23:58Z
dc.date.available: 2023-07-22T06:23:58Z
dc.date.issued: 2020
dc.identifier.citation: Journal of Machine Learning Research, 2020, v. 21
dc.identifier.issn: 1532-4435
dc.identifier.uri: http://hdl.handle.net/10722/328782
dc.language: eng
dc.relation.ispartof: Journal of Machine Learning Research
dc.subject: Dimension reduction
dc.subject: Gradient descent
dc.subject: Online learning
dc.subject: Perturbation
dc.subject: Singular value decomposition
dc.subject: Sliced inverse regression
dc.title: Online sufficient dimension reduction through sliced inverse regression
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.scopus: eid_2-s2.0-85086798987
dc.identifier.volume: 21
dc.identifier.eissn: 1533-7928
