
Conference Paper: Risk Bounds for Over-parameterized Maximum Margin Classification on Sub-Gaussian Mixtures

Title: Risk Bounds for Over-parameterized Maximum Margin Classification on Sub-Gaussian Mixtures
Authors: Cao, Y; Gu, Q; Belkin, M
Keywords: Maximum margin classification; Over-parameterization; Benign overfitting
Issue Date: 2021
Publisher: NeurIPS
Citation: Thirty-fifth Conference on Neural Information Processing Systems (NeurIPS 2021), Online Conference, December 6-14, 2021. In Advances in Neural Information Processing Systems: 35th Conference on Neural Information Processing Systems (NeurIPS 2021), v. 34, p. 8407-8418
Abstract: Modern machine learning systems such as deep neural networks are often highly over-parameterized so that they can fit the noisy training data exactly, yet they can still achieve small test errors in practice. In this paper, we study this 'benign overfitting' phenomenon of the maximum margin classifier for linear classification problems. Specifically, we consider data generated from sub-Gaussian mixtures, and provide a tight risk bound for the maximum margin linear classifier in the over-parameterized setting. Our results precisely characterize the condition under which benign overfitting can occur in linear classification problems, and improve on previous work. They also have direct implications for over-parameterized logistic regression.
Description: Poster presentations
Persistent Identifier: http://hdl.handle.net/10722/314543
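As a reading aid for the abstract above, the following is a minimal simulation sketch, not taken from the paper. It draws a two-component Gaussian mixture (a special case of a sub-Gaussian mixture) in a dimension far larger than the sample size, then runs gradient descent on the logistic loss; on linearly separable data this is known to converge in direction to the maximum margin classifier, which is consistent with the abstract's remark on over-parameterized logistic regression. The sample size, dimension, signal strength, step size, and iteration count are all illustrative assumptions.

```python
# Minimal sketch (assumptions: Gaussian mixture; n, p, signal, lr are illustrative).
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 2000                       # over-parameterized regime: p >> n
mu = np.zeros(p)
mu[0] = 3.0                           # mixture mean; strength chosen for illustration

# Each sample is x_i = y_i * mu + z_i with sub-Gaussian (here Gaussian) noise z_i.
y = rng.choice([-1.0, 1.0], size=n)
X = y[:, None] * mu + rng.standard_normal((n, p))

# Gradient descent on the logistic loss; on separable data its direction
# approaches the maximum margin classifier as training continues.
w = np.zeros(p)
lr = 0.1
for _ in range(10_000):
    margins = np.clip(y * (X @ w), -50, 50)    # clip to avoid overflow in exp
    weights = y / (1.0 + np.exp(margins))      # per-sample logistic-gradient weight
    w += lr * (X * weights[:, None]).mean(axis=0)

train_err = np.mean(np.sign(X @ w) != y)       # expect an exact fit: 0.0

# Fresh draws from the same mixture estimate the risk (test error).
m = 5000
y_test = rng.choice([-1.0, 1.0], size=m)
X_test = y_test[:, None] * mu + rng.standard_normal((m, p))
test_err = np.mean(np.sign(X_test @ w) != y_test)

print(f"train error = {train_err:.3f}, test error = {test_err:.3f}")
```

With these settings the classifier fits the training data exactly while the test error stays small, illustrating the qualitative behavior whose precise conditions the paper's risk bound characterizes.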

 

DC Field | Value | Language
dc.contributor.author | Cao, Y | -
dc.contributor.author | Gu, Q | -
dc.contributor.author | Belkin, M | -
dc.date.accessioned | 2022-07-22T05:26:32Z | -
dc.date.available | 2022-07-22T05:26:32Z | -
dc.date.issued | 2021 | -
dc.identifier.citation | Thirty-fifth Conference on Neural Information Processing Systems (NeurIPS 2021), Online Conference, December 6-14, 2021. In Advances in Neural Information Processing Systems: 35th Conference on Neural Information Processing Systems (NeurIPS 2021), v. 34, p. 8407-8418 | -
dc.identifier.uri | http://hdl.handle.net/10722/314543 | -
dc.description | Poster presentations | -
dc.description.abstract | Modern machine learning systems such as deep neural networks are often highly over-parameterized so that they can fit the noisy training data exactly, yet they can still achieve small test errors in practice. In this paper, we study this 'benign overfitting' phenomenon of the maximum margin classifier for linear classification problems. Specifically, we consider data generated from sub-Gaussian mixtures, and provide a tight risk bound for the maximum margin linear classifier in the over-parameterized setting. Our results precisely characterize the condition under which benign overfitting can occur in linear classification problems, and improve on previous work. They also have direct implications for over-parameterized logistic regression. | -
dc.language | eng | -
dc.publisher | NeurIPS | -
dc.relation.ispartof | Advances in Neural Information Processing Systems: 35th Conference on Neural Information Processing Systems (NeurIPS 2021) | -
dc.subject | Maximum margin classification | -
dc.subject | Over-parameterization | -
dc.subject | Benign overfitting | -
dc.title | Risk Bounds for Over-parameterized Maximum Margin Classification on Sub-Gaussian Mixtures | -
dc.type | Conference_Paper | -
dc.identifier.email | Cao, Y: yuancao@hku.hk | -
dc.identifier.authority | Cao, Y=rp02862 | -
dc.identifier.hkuros | 334651 | -
dc.identifier.volume | 34 | -
dc.identifier.spage | 8407 | -
dc.identifier.epage | 8418 | -
dc.publisher.place | United States | -
