Conference Paper: Semi-supervised learning with scarce annotations

Title: Semi-supervised learning with scarce annotations
Authors: Rebuffi, Sylvestre Alvise; Ehrhardt, Sebastien; Han, Kai; Vedaldi, Andrea; Zisserman, Andrew
Issue Date: 2020
Citation: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 2020, v. 2020-June, p. 3294-3302
Abstract: While semi-supervised learning (SSL) algorithms provide an efficient way to make use of both labelled and unlabelled data, they generally struggle when the number of annotated samples is very small. In this work, we consider the problem of SSL multi-class classification with very few labelled instances. We introduce two key ideas. The first is simple but effective: we leverage the power of transfer learning among different tasks and self-supervision to initialize a good representation of the data without making use of any labels. The second is a new algorithm for SSL that can exploit such a pre-trained representation well. The algorithm works by alternating two phases, one fitting the labelled points and one fitting the unlabelled ones, with carefully controlled information flow between them. The benefits are greatly reduced overfitting of the labelled data and the avoidance of issues with balancing labelled and unlabelled losses during training. We show empirically that this method can successfully train competitive models with as few as 10 labelled data points per class. More generally, we show that bootstrapping features using self-supervised learning always improves SSL on standard benchmarks, and that our algorithm performs increasingly well compared to other methods when refining from other tasks or datasets.
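The alternating two-phase scheme the abstract describes — one phase fitting the few labelled points, the other fitting the unlabelled ones with controlled information flow between them — can be illustrated on a toy problem. The sketch below is an illustrative assumption, not the authors' implementation: it uses a plain linear classifier and pseudo-labels as the carrier of information from the labelled phase to the unlabelled phase.

```python
# Illustrative sketch of alternating two-phase SSL on a toy 2-class problem.
# All model choices and hyperparameters here are assumptions for exposition,
# not the method from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs; only one labelled point per class.
n_unlab = 200
X_lab = np.array([[-2.0, 0.0], [2.0, 0.0]])
y_lab = np.array([0.0, 1.0])
X_unlab = np.vstack([
    rng.normal([-2, 0], 0.5, size=(n_unlab // 2, 2)),
    rng.normal([2, 0], 0.5, size=(n_unlab // 2, 2)),
])

w = np.zeros(2)  # linear model: predict class 1 when X @ w > 0


def grad_logistic(X, y, w):
    """Gradient of the mean logistic loss for labels y in {0, 1}."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)


lr = 0.5
for step in range(100):
    # Phase 1: fit the labelled points only.
    w -= lr * grad_logistic(X_lab, y_lab, w)
    # Phase 2: fit the unlabelled points against their current pseudo-labels;
    # the pseudo-labels are the controlled channel from the labelled phase.
    pseudo = (X_unlab @ w > 0).astype(float)
    w -= lr * grad_logistic(X_unlab, pseudo, w)

pred = (X_unlab @ w > 0).astype(int)
truth = np.array([0] * (n_unlab // 2) + [1] * (n_unlab // 2))
acc = (pred == truth).mean()
```

Because each phase optimizes only one of the two losses, there is no weighting hyperparameter to balance labelled against unlabelled terms, which mirrors the benefit the abstract claims.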
Persistent Identifier: http://hdl.handle.net/10722/311489
ISSN: 2160-7508
2020 SCImago Journal Rankings: 1.122
ISI Accession Number ID: WOS:000788279003042

DC Field | Value | Language
dc.contributor.author | Rebuffi, Sylvestre Alvise | -
dc.contributor.author | Ehrhardt, Sebastien | -
dc.contributor.author | Han, Kai | -
dc.contributor.author | Vedaldi, Andrea | -
dc.contributor.author | Zisserman, Andrew | -
dc.date.accessioned | 2022-03-22T11:54:03Z | -
dc.date.available | 2022-03-22T11:54:03Z | -
dc.date.issued | 2020 | -
dc.identifier.citation | IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 2020, v. 2020-June, p. 3294-3302 | -
dc.identifier.issn | 2160-7508 | -
dc.identifier.uri | http://hdl.handle.net/10722/311489 | -
dc.description.abstract | While semi-supervised learning (SSL) algorithms provide an efficient way to make use of both labelled and unlabelled data, they generally struggle when the number of annotated samples is very small. In this work, we consider the problem of SSL multi-class classification with very few labelled instances. We introduce two key ideas. The first is simple but effective: we leverage the power of transfer learning among different tasks and self-supervision to initialize a good representation of the data without making use of any labels. The second is a new algorithm for SSL that can exploit such a pre-trained representation well. The algorithm works by alternating two phases, one fitting the labelled points and one fitting the unlabelled ones, with carefully controlled information flow between them. The benefits are greatly reduced overfitting of the labelled data and the avoidance of issues with balancing labelled and unlabelled losses during training. We show empirically that this method can successfully train competitive models with as few as 10 labelled data points per class. More generally, we show that bootstrapping features using self-supervised learning always improves SSL on standard benchmarks, and that our algorithm performs increasingly well compared to other methods when refining from other tasks or datasets. | -
dc.language | eng | -
dc.relation.ispartof | IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops | -
dc.title | Semi-supervised learning with scarce annotations | -
dc.type | Conference_Paper | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.1109/CVPRW50498.2020.00389 | -
dc.identifier.scopus | eid_2-s2.0-85090133302 | -
dc.identifier.volume | 2020-June | -
dc.identifier.spage | 3294 | -
dc.identifier.epage | 3302 | -
dc.identifier.eissn | 2160-7516 | -
dc.identifier.isi | WOS:000788279003042 | -
