Conference Paper: Stochastic hard thresholding algorithms for AUC maximization

Title: Stochastic hard thresholding algorithms for AUC maximization
Authors: Yang, Zhenhuan; Zhou, Baojian; Lei, Yunwen; Ying, Yiming
Keywords: Area Under the ROC Curve (AUC); Imbalanced classification; Sparse learning; Stochastic hard thresholding
Issue Date: 2020
Citation: Proceedings - IEEE International Conference on Data Mining, ICDM, 2020, v. 2020-November, p. 741-750
Abstract: In this paper, we aim to develop stochastic hard thresholding algorithms for the important problem of AUC maximization in imbalanced classification. The main challenge is the pairwise loss involved in AUC maximization. We overcome this obstacle by reformulating the U-statistics objective function as an empirical risk minimization (ERM), from which a stochastic hard thresholding algorithm (SHT-AUC) is developed. To the best of our knowledge, this is the first attempt to provide stochastic hard thresholding algorithms for AUC maximization with a per-iteration cost of $\mathcal{O}(bd)$, where $d$ and $b$ are the dimension of the data and the minibatch size, respectively. We show that the proposed algorithm enjoys a linear convergence rate up to a tolerance error. In particular, we show that, if the data is generated from a Gaussian distribution, its convergence becomes slower as the data gets more imbalanced. We conduct extensive experiments to show the efficiency and effectiveness of the proposed algorithms.
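
For intuition about the iteration structure described in the abstract, below is a minimal, hypothetical Python/NumPy sketch of a generic stochastic hard-thresholding update, not the paper's SHT-AUC algorithm: it uses a plain least-squares surrogate in place of the ERM reformulation of the pairwise AUC objective, and the function names, step size, sparsity level k, and batch size are illustrative assumptions. It only shows why one minibatch step costs $\mathcal{O}(bd)$: the gradient touches a batch of size b over d features, followed by a hard-thresholding projection that keeps the k largest-magnitude coordinates.

    # Hypothetical sketch: generic stochastic hard thresholding with a
    # least-squares surrogate loss -- NOT the paper's SHT-AUC objective.
    import numpy as np

    def hard_threshold(w, k):
        """Keep only the k largest-magnitude entries of w, zero the rest."""
        if k >= w.size:
            return w
        out = np.zeros_like(w)
        top = np.argpartition(np.abs(w), -k)[-k:]   # indices of the top-k magnitudes
        out[top] = w[top]
        return out

    def stochastic_hard_thresholding(X, y, k, lr=0.1, batch=32, iters=500, seed=0):
        """Minibatch gradient step followed by projection onto k-sparse vectors.

        Each iteration forms a gradient from a batch of size b over d features,
        so its cost scales as O(bd), matching the per-iteration cost quoted above.
        """
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.zeros(d)
        for _ in range(iters):
            idx = rng.choice(n, size=min(batch, n), replace=False)
            grad = X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)  # minibatch gradient, O(bd)
            w = hard_threshold(w - lr * grad, k)                # k-sparse projection
        return w

In the paper's setting, the squared-loss gradient would be replaced by a stochastic gradient of the ERM reformulation of the pairwise AUC loss; the sketch only conveys the iteration structure and its per-iteration cost.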
Persistent Identifier: http://hdl.handle.net/10722/329681
ISSN: 1550-4786
2020 SCImago Journal Rankings: 0.545
ISI Accession Number ID: WOS:000630177700073


DC Field: Value
dc.contributor.author: Yang, Zhenhuan
dc.contributor.author: Zhou, Baojian
dc.contributor.author: Lei, Yunwen
dc.contributor.author: Ying, Yiming
dc.date.accessioned: 2023-08-09T03:34:34Z
dc.date.available: 2023-08-09T03:34:34Z
dc.date.issued: 2020
dc.identifier.citation: Proceedings - IEEE International Conference on Data Mining, ICDM, 2020, v. 2020-November, p. 741-750
dc.identifier.issn: 1550-4786
dc.identifier.uri: http://hdl.handle.net/10722/329681
dc.description.abstract: In this paper, we aim to develop stochastic hard thresholding algorithms for the important problem of AUC maximization in imbalanced classification. The main challenge is the pairwise loss involved in AUC maximization. We overcome this obstacle by reformulating the U-statistics objective function as an empirical risk minimization (ERM), from which a stochastic hard thresholding algorithm (SHT-AUC) is developed. To the best of our knowledge, this is the first attempt to provide stochastic hard thresholding algorithms for AUC maximization with a per-iteration cost of $\mathcal{O}(bd)$, where $d$ and $b$ are the dimension of the data and the minibatch size, respectively. We show that the proposed algorithm enjoys a linear convergence rate up to a tolerance error. In particular, we show that, if the data is generated from a Gaussian distribution, its convergence becomes slower as the data gets more imbalanced. We conduct extensive experiments to show the efficiency and effectiveness of the proposed algorithms.
dc.language: eng
dc.relation.ispartof: Proceedings - IEEE International Conference on Data Mining, ICDM
dc.subject: Area Under the ROC Curve (AUC)
dc.subject: Imbalanced classification
dc.subject: Sparse learning
dc.subject: Stochastic hard thresholding
dc.title: Stochastic hard thresholding algorithms for AUC maximization
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/ICDM50108.2020.00083
dc.identifier.scopus: eid_2-s2.0-85100873921
dc.identifier.volume: 2020-November
dc.identifier.spage: 741
dc.identifier.epage: 750
dc.identifier.isi: WOS:000630177700073
