File Download
There are no files associated with this item.
Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1016/j.ins.2017.04.022
- Scopus: eid_2-s2.0-85017645900
- WOS: WOS:000401886700005
Article: Online pairwise learning algorithms with convex loss functions
Title | Online pairwise learning algorithms with convex loss functions |
---|---|
Authors | Lin, Junhong; Lei, Yunwen; Zhang, Bo; Zhou, Ding Xuan |
Keywords | Learning theory; Online learning; Pairwise learning; Reproducing Kernel Hilbert Space |
Issue Date | 2017 |
Citation | Information Sciences, 2017, v. 406-407, p. 57-70 |
Abstract | Online pairwise learning algorithms with general convex loss functions without regularization in a Reproducing Kernel Hilbert Space (RKHS) are investigated. Under mild conditions on the loss functions and the RKHS, upper bounds for the expected excess generalization error are derived in terms of the approximation error when the stepsize sequence decays polynomially. In particular, for Lipschitz loss functions such as the hinge loss, the logistic loss and the absolute-value loss, the bounds can be of order O(formula omitted) after T iterations, while for the least squares loss, the bounds can be of order O(formula omitted). In comparison with previous works on these algorithms, a broader family of convex loss functions is studied here, and refined upper bounds are obtained. |
Persistent Identifier | http://hdl.handle.net/10722/329438 |
ISSN | 0020-0255 (2022 Impact Factor: 8.1; 2023 SCImago Journal Rankings: 2.238) |
ISI Accession Number ID | WOS:000401886700005 |
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Lin, Junhong | - |
dc.contributor.author | Lei, Yunwen | - |
dc.contributor.author | Zhang, Bo | - |
dc.contributor.author | Zhou, Ding Xuan | - |
dc.date.accessioned | 2023-08-09T03:32:47Z | - |
dc.date.available | 2023-08-09T03:32:47Z | - |
dc.date.issued | 2017 | - |
dc.identifier.citation | Information Sciences, 2017, v. 406-407, p. 57-70 | - |
dc.identifier.issn | 0020-0255 | - |
dc.identifier.uri | http://hdl.handle.net/10722/329438 | - |
dc.description.abstract | Online pairwise learning algorithms with general convex loss functions without regularization in a Reproducing Kernel Hilbert Space (RKHS) are investigated. Under mild conditions on the loss functions and the RKHS, upper bounds for the expected excess generalization error are derived in terms of the approximation error when the stepsize sequence decays polynomially. In particular, for Lipschitz loss functions such as the hinge loss, the logistic loss and the absolute-value loss, the bounds can be of order O(formula omitted) after T iterations, while for the least squares loss, the bounds can be of order O(formula omitted). In comparison with previous works on these algorithms, a broader family of convex loss functions is studied here, and refined upper bounds are obtained. | -
dc.language | eng | - |
dc.relation.ispartof | Information Sciences | - |
dc.subject | Learning theory | - |
dc.subject | Online learning | - |
dc.subject | Pairwise learning | - |
dc.subject | Reproducing Kernel Hilbert Space | - |
dc.title | Online pairwise learning algorithms with convex loss functions | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1016/j.ins.2017.04.022 | - |
dc.identifier.scopus | eid_2-s2.0-85017645900 | - |
dc.identifier.volume | 406-407 | - |
dc.identifier.spage | 57 | - |
dc.identifier.epage | 70 | - |
dc.identifier.isi | WOS:000401886700005 | - |
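
For readers who want a concrete picture of the algorithm family described in the abstract, below is a minimal sketch of an unregularized online pairwise learning update in an RKHS with a convex surrogate loss and a polynomially decaying step size. The Gaussian pair kernel, the hinge surrogate, the constants, and the toy data are illustrative assumptions, not the paper's exact setup or experiments.

```python
import numpy as np

def gaussian_pair_kernel(p, q, sigma=1.0):
    """Gaussian kernel on concatenated pair features (illustrative choice)."""
    d = np.concatenate(p) - np.concatenate(q)
    return np.exp(-np.dot(d, d) / (2.0 * sigma ** 2))

def hinge_derivative(fval, y, y_prime):
    """Derivative of the hinge surrogate max(0, 1 - sgn(y - y') * f) w.r.t. f."""
    r = np.sign(y - y_prime)
    return -r if r * fval < 1.0 else 0.0

def online_pairwise_learning(X, Y, gamma1=0.5, theta=0.5, sigma=1.0):
    """Unregularized online pairwise learning in an RKHS (sketch).

    At step t, the current sample x_t is paired with every previous sample,
    and f is updated by an averaged kernel-gradient step with step size
    gamma1 * t**(-theta). The iterate is kept as pair anchors with
    coefficients: f = sum_i a_i * K(p_i, .).
    """
    anchors, coeffs = [], []

    def f_eval(pair):
        return sum(a * gaussian_pair_kernel(p, pair, sigma)
                   for a, p in zip(coeffs, anchors))

    T = len(X)
    for t in range(1, T):                      # x_0 only ever serves as a past example
        gamma_t = gamma1 * (t ** -theta)       # polynomially decaying step size
        for j in range(t):                     # average the gradient over past pairs
            pair = (X[t], X[j])
            g = hinge_derivative(f_eval(pair), Y[t], Y[j]) / t
            if g != 0.0:
                anchors.append(pair)
                coeffs.append(-gamma_t * g)
    return anchors, coeffs, f_eval

# Toy usage: rank points with a positive first coordinate above the others.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
Y = (X[:, 0] > 0).astype(float)
anchors, coeffs, f = online_pairwise_learning(X, Y)
print("f on a correctly ordered pair:",
      f((np.array([2.0, 0.0]), np.array([-2.0, 0.0]))))
```

The hinge loss here is one of the Lipschitz losses mentioned in the abstract; swapping in another convex surrogate (e.g. logistic or least squares) only changes `hinge_derivative`, while the averaged update and the decaying step size stay the same.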