Conference Paper: Fine-grained Generalization Analysis of Structured Output Prediction

Title: Fine-grained Generalization Analysis of Structured Output Prediction
Authors: Mustafa, Waleed; Lei, Yunwen; Ledent, Antoine; Kloft, Marius
Issue Date: 2021
Citation: IJCAI International Joint Conference on Artificial Intelligence, 2021, p. 2841-2847
Abstract: In machine learning we often encounter structured output prediction problems (SOPPs), i.e. problems where the output space admits a rich internal structure. Application domains where SOPPs naturally occur include natural language processing, speech recognition, and computer vision. Typical SOPPs have an extremely large label set, which grows exponentially as a function of the size of the output. Existing generalization analysis implies generalization bounds with at least a square-root dependency on the cardinality d of the label set, which can be vacuous in practice. In this paper, we significantly improve the state of the art by developing novel high-probability bounds with a logarithmic dependency on d. Moreover, we leverage the lens of algorithmic stability to develop generalization bounds in expectation without any dependency on d. Our results therefore build a solid theoretical foundation for learning in large-scale SOPPs. Furthermore, we extend our results to learning with weakly dependent data.
Persistent Identifier: http://hdl.handle.net/10722/329786
ISSN: 1045-0823
2020 SCImago Journal Rankings: 0.649
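The abstract above contrasts a square-root with a logarithmic dependency on the label-set cardinality d. As a purely schematic sketch (not the paper's actual theorems; the sample size n and the complexity-type constant C are assumed symbols introduced only for illustration), the improvement can be pictured as moving from a bound of the first shape below to one of the second:

% Schematic illustration only, not the paper's stated results.
% n = number of training examples, d = cardinality of the label set,
% C = an assumed complexity/Lipschitz-type constant of the hypothesis class.
\[
  \underbrace{\mathbb{E}\,\ell(h) - \widehat{\mathbb{E}}_n\,\ell(h)}_{\text{generalization gap}}
  \;\lesssim\; \frac{C\sqrt{d}}{\sqrt{n}}
  \qquad \text{(prior analyses: square-root dependency on } d\text{)}
\]
\[
  \mathbb{E}\,\ell(h) - \widehat{\mathbb{E}}_n\,\ell(h)
  \;\lesssim\; \frac{C\log d}{\sqrt{n}}
  \qquad \text{(this paper: logarithmic dependency on } d\text{)}
\]

Since d grows exponentially with the output size in typical SOPPs, a sqrt(d) factor can dominate the bound, whereas a log d factor grows only linearly in the output size; this is the sense in which the abstract's bounds are non-vacuous at scale.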

 

DC Field | Value | Language
dc.contributor.author | Mustafa, Waleed | -
dc.contributor.author | Lei, Yunwen | -
dc.contributor.author | Ledent, Antoine | -
dc.contributor.author | Kloft, Marius | -
dc.date.accessioned | 2023-08-09T03:35:19Z | -
dc.date.available | 2023-08-09T03:35:19Z | -
dc.date.issued | 2021 | -
dc.identifier.citation | IJCAI International Joint Conference on Artificial Intelligence, 2021, p. 2841-2847 | -
dc.identifier.issn | 1045-0823 | -
dc.identifier.uri | http://hdl.handle.net/10722/329786 | -
dc.description.abstract | In machine learning we often encounter structured output prediction problems (SOPPs), i.e. problems where the output space admits a rich internal structure. Application domains where SOPPs naturally occur include natural language processing, speech recognition, and computer vision. Typical SOPPs have an extremely large label set, which grows exponentially as a function of the size of the output. Existing generalization analysis implies generalization bounds with at least a square-root dependency on the cardinality d of the label set, which can be vacuous in practice. In this paper, we significantly improve the state of the art by developing novel high-probability bounds with a logarithmic dependency on d. Moreover, we leverage the lens of algorithmic stability to develop generalization bounds in expectation without any dependency on d. Our results therefore build a solid theoretical foundation for learning in large-scale SOPPs. Furthermore, we extend our results to learning with weakly dependent data. | -
dc.language | eng | -
dc.relation.ispartof | IJCAI International Joint Conference on Artificial Intelligence | -
dc.title | Fine-grained Generalization Analysis of Structured Output Prediction | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.scopus | eid_2-s2.0-85125488776 | -
dc.identifier.spage | 2841 | -
dc.identifier.epage | 2847 | -
