Conference Paper: Collaborative and Adversarial Network for Unsupervised Domain Adaptation

Title: Collaborative and Adversarial Network for Unsupervised Domain Adaptation
Authors: Zhang, Weichen; Ouyang, Wanli; Li, Wen; Xu, Dong
Issue Date: 2018
Citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2018, p. 3801-3809
Abstract: In this paper, we propose a new unsupervised domain adaptation approach called Collaborative and Adversarial Network (CAN) through domain-collaborative and domain-adversarial training of neural networks. We add several domain classifiers on multiple CNN feature extraction blocks, in which each domain classifier is connected to the hidden representations from one block and one loss function is defined based on the hidden representation and the domain labels (e.g., source and target). We design a new loss function by integrating the losses from all blocks in order to learn domain-informative representations from lower blocks through collaborative learning and domain-uninformative representations from higher blocks through adversarial learning. We further extend our CAN method to Incremental CAN (iCAN), in which we iteratively select a set of pseudo-labelled target samples based on the image classifier and the last domain classifier from the previous training epoch, and re-train our CAN model using the enlarged training set. Comprehensive experiments on two benchmark datasets, Office and ImageCLEF-DA, clearly demonstrate the effectiveness of our newly proposed CAN and iCAN approaches for unsupervised domain adaptation.
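The abstract's central idea — one domain classifier per feature block, with the per-block losses combined so that lower blocks are trained collaboratively (domain-informative) and higher blocks adversarially (domain-uninformative) — can be sketched numerically. The following is a minimal illustration of such an integrated loss, not the paper's implementation; all names, the toy data, and the particular weights are hypothetical, and the adversarial effect is expressed here simply as a negative weight (in practice it is typically realized with a gradient-reversal layer).

```python
import numpy as np

def binary_cross_entropy(p, y):
    """Domain-classification loss: p = predicted P(source), y = domain label
    (1 = source, 0 = target)."""
    p = np.clip(p, 1e-7, 1 - 1e-7)  # avoid log(0)
    return float(-np.mean(y * np.log(p) + (1 - y) * np.log(1 - p)))

def integrated_domain_loss(block_probs, domain_labels, weights):
    """Weighted sum of per-block domain losses, sum_k lambda_k * L_k.
    Positive lambda_k (lower blocks) -> collaborative: the feature extractor
    is rewarded for domain-informative features. Negative lambda_k (higher
    blocks) -> adversarial: it is rewarded for domain-confusing features."""
    return sum(w * binary_cross_entropy(p, domain_labels)
               for p, w in zip(block_probs, weights))

# Toy example (hypothetical numbers): 3 blocks, 4 samples.
labels = np.array([1, 1, 0, 0])
probs = [np.array([0.9, 0.8, 0.2, 0.1]),   # lower block: domain easily separable
         np.array([0.7, 0.6, 0.4, 0.3]),   # middle block
         np.array([0.5, 0.5, 0.5, 0.5])]   # top block: domains indistinguishable
weights = [1.0, 0.5, -1.0]                  # sign flips to adversarial at the top
loss = integrated_domain_loss(probs, labels, weights)
```

Note how the configuration the paper argues for (discriminable domains in low blocks, confused domains in the top block) drives this combined objective down, since the top block's chance-level loss enters with a negative sign.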
Persistent Identifier: http://hdl.handle.net/10722/321826
ISSN: 1063-6919
2020 SCImago Journal Rankings: 4.658
ISI Accession Number ID: WOS:000457843603098


DC Field: Value
dc.contributor.author: Zhang, Weichen
dc.contributor.author: Ouyang, Wanli
dc.contributor.author: Li, Wen
dc.contributor.author: Xu, Dong
dc.date.accessioned: 2022-11-03T02:21:43Z
dc.date.available: 2022-11-03T02:21:43Z
dc.date.issued: 2018
dc.identifier.citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2018, p. 3801-3809
dc.identifier.issn: 1063-6919
dc.identifier.uri: http://hdl.handle.net/10722/321826
dc.description.abstract: In this paper, we propose a new unsupervised domain adaptation approach called Collaborative and Adversarial Network (CAN) through domain-collaborative and domain-adversarial training of neural networks. We add several domain classifiers on multiple CNN feature extraction blocks, in which each domain classifier is connected to the hidden representations from one block and one loss function is defined based on the hidden representation and the domain labels (e.g., source and target). We design a new loss function by integrating the losses from all blocks in order to learn domain-informative representations from lower blocks through collaborative learning and domain-uninformative representations from higher blocks through adversarial learning. We further extend our CAN method to Incremental CAN (iCAN), in which we iteratively select a set of pseudo-labelled target samples based on the image classifier and the last domain classifier from the previous training epoch, and re-train our CAN model using the enlarged training set. Comprehensive experiments on two benchmark datasets, Office and ImageCLEF-DA, clearly demonstrate the effectiveness of our newly proposed CAN and iCAN approaches for unsupervised domain adaptation.
dc.language: eng
dc.relation.ispartof: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
dc.title: Collaborative and Adversarial Network for Unsupervised Domain Adaptation
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/CVPR.2018.00400
dc.identifier.scopus: eid_2-s2.0-85058099493
dc.identifier.spage: 3801
dc.identifier.epage: 3809
dc.identifier.isi: WOS:000457843603098
