Article: A Plug-in Approach to Neyman-Pearson Classification

Title: A Plug-in Approach to Neyman-Pearson Classification
Authors: Tong, Xin
Keywords: Anomaly detection; Neyman-Pearson paradigm; Nonparametric statistics; Oracle inequality; Plug-in approach
Issue Date: 2013
Citation: Journal of Machine Learning Research, 2013, v. 14, p. 3011-3040
Abstract: The Neyman-Pearson (NP) paradigm in binary classification treats type I and type II errors with different priorities. It seeks classifiers that minimize the type II error, subject to a type I error constraint at a user-specified level α. In this paper, plug-in classifiers are developed under the NP paradigm. Based on the fundamental Neyman-Pearson Lemma, we propose two related plug-in classifiers which threshold, respectively, the class-conditional density ratio and the regression function. These two classifiers handle different sampling schemes. This work focuses on theoretical properties of the proposed classifiers; in particular, we derive oracle inequalities that can be viewed as finite-sample versions of risk bounds. NP classification can be used to address anomaly detection problems, where asymmetry in errors is an intrinsic property. In contrast to a common practice in anomaly detection, which thresholds the normal-class density, our approach does not assume a specific form for the anomaly distribution. This consideration is particularly necessary when the anomaly class density is far from uniform. © 2013 Xin Tong.
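To make the plug-in idea in the abstract concrete, the following is a minimal, illustrative sketch (not taken from the paper or this record) of the second construction it mentions: estimate the regression function η(x) = P(Y = 1 | X = x) with any probabilistic classifier, then choose a threshold from a held-out class-0 sample so that the empirical type I error stays at or below α. All function and variable names are hypothetical, and the logistic-regression estimator and empirical-quantile threshold are assumptions of this sketch; the paper's classifiers use nonparametric estimates and thresholds with theoretical guarantees (oracle inequalities) that this example does not reproduce.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def np_plugin_classifier(X_labeled, y_labeled, X0_holdout, alpha=0.05):
    """Illustrative plug-in Neyman-Pearson classifier (sketch only).

    Estimates eta(x) = P(Y = 1 | X = x) with a generic probabilistic
    classifier, then calibrates a decision threshold on a held-out
    sample drawn from class 0 so that the empirical type I error does
    not exceed alpha.
    """
    # Step 1: plug-in estimate of the regression function.
    eta_hat = LogisticRegression().fit(X_labeled, y_labeled)

    # Step 2: scores of held-out points that truly belong to class 0.
    scores_class0 = eta_hat.predict_proba(X0_holdout)[:, 1]

    # Step 3: empirical (1 - alpha) quantile of the class-0 scores;
    # predicting class 1 only strictly above it keeps the empirical
    # type I error at or below alpha on the held-out sample.
    threshold = np.quantile(scores_class0, 1.0 - alpha)

    def classify(X_new):
        return (eta_hat.predict_proba(X_new)[:, 1] > threshold).astype(int)

    return classify, threshold
```

The first construction in the abstract works the same way but thresholds an estimate of the class-conditional density ratio instead of the regression function, which suits sampling schemes where the two classes are observed separately.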
Persistent Identifier: http://hdl.handle.net/10722/354110
ISSN: 1532-4435
2023 Impact Factor: 4.3
2023 SCImago Journal Rankings: 2.796

 

DC Field | Value | Language
dc.contributor.author | Tong, Xin | -
dc.date.accessioned | 2025-02-07T08:46:31Z | -
dc.date.available | 2025-02-07T08:46:31Z | -
dc.date.issued | 2013 | -
dc.identifier.citation | Journal of Machine Learning Research, 2013, v. 14, p. 3011-3040 | -
dc.identifier.issn | 1532-4435 | -
dc.identifier.uri | http://hdl.handle.net/10722/354110 | -
dc.description.abstract | The Neyman-Pearson (NP) paradigm in binary classification treats type I and type II errors with different priorities. It seeks classifiers that minimize type II error, subject to a type I error constraint under a user-specified level α. In this paper, plug-in classifiers are developed under the NP paradigm. Based on the fundamental Neyman-Pearson Lemma, we propose two related plug-in classifiers which amount to thresholding respectively the class conditional density ratio and the regression function. These two classifiers handle different sampling schemes. This work focuses on theoretical properties of the proposed classifiers; in particular, we derive oracle inequalities that can be viewed as finite sample versions of risk bounds. NP classification can be used to address anomaly detection problems, where asymmetry in errors is an intrinsic property. As opposed to a common practice in anomaly detection that consists of thresholding normal class density, our approach does not assume a specific form for anomaly distributions. Such consideration is particularly necessary when the anomaly class density is far from uniformly distributed. © 2013 Xin Tong. | -
dc.language | eng | -
dc.relation.ispartof | Journal of Machine Learning Research | -
dc.subject | Anomaly detection | -
dc.subject | Neyman-Pearson paradigm | -
dc.subject | Nonparametric statistics | -
dc.subject | Oracle inequality | -
dc.subject | Plug-in approach | -
dc.title | A plug-in approach to Neyman-Pearson classification | -
dc.type | Article | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.scopus | eid_2-s2.0-84887469287 | -
dc.identifier.volume | 14 | -
dc.identifier.spage | 3011 | -
dc.identifier.epage | 3040 | -
dc.identifier.eissn | 1533-7928 | -
