Article: Structure Guided Deep Neural Network for Unsupervised Active Learning

Title: Structure Guided Deep Neural Network for Unsupervised Active Learning
Authors: Li, Changsheng; Ma, Handong; Yuan, Ye; Wang, Guoren; Xu, Dong
Keywords: deep neural network; imbalance data; self-supervised learning; structure preserving; unsupervised active learning
Issue Date: 2022
Citation: IEEE Transactions on Image Processing, 2022, v. 31, p. 2767-2781
Abstract: Unsupervised active learning has become an active research topic in the machine learning and computer vision communities; its goal is to choose a subset of representative samples to be labeled in an unsupervised setting. Most existing approaches rely on shallow linear models, assuming that each sample can be well approximated by the span (i.e., the set of all linear combinations) of the selected samples, and then take these selected samples as the representative ones for manual labeling. However, in many real-world scenarios the data do not conform to linear models, and modeling the nonlinearity of the data often becomes the key point of unsupervised active learning. Moreover, existing works often aim to reconstruct the whole dataset well while ignoring the important cluster structure, especially for imbalanced data. In this paper, we present a novel deep unsupervised active learning framework. The proposed method explicitly learns a nonlinear embedding that maps each input into a latent space via a deep neural network, and introduces a selection block to select representative samples in the learnt latent space through a self-supervised learning strategy. In the selection block, we aim not only to preserve the global structure of the data but also to capture its cluster structure, in order to handle the data imbalance issue during sample selection. Meanwhile, we take advantage of the clustering result to provide self-supervised information to guide the above processes. Finally, we attempt to preserve the local structure of the data, so that the data embedding becomes more precise and the model performance can be further improved. Extensive experimental results on several publicly available datasets clearly demonstrate the effectiveness of our method compared with the state of the art.
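The abstract describes the pipeline only at a high level: embed the data nonlinearly, cluster in the latent space, and select representatives per cluster so that minority clusters are not drowned out. As a rough illustration of that general idea (not the paper's actual architecture), the sketch below uses a fixed random tanh projection as a stand-in for the trained deep embedding and plain k-means as a stand-in for the self-supervised selection block; all function names (`nonlinear_embed`, `select_representatives`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def nonlinear_embed(X, dim=8):
    # Stand-in for the learned deep embedding: a fixed random
    # nonlinear projection (the paper instead trains a DNN).
    W = rng.normal(size=(X.shape[1], dim))
    return np.tanh(X @ W)

def kmeans(Z, k, iters=50):
    # Minimal k-means in the latent space; fancy indexing copies,
    # so mutating `centers` does not touch Z.
    centers = Z[rng.choice(len(Z), k, replace=False)]
    for _ in range(iters):
        dists = ((Z[:, None] - centers[None]) ** 2).sum(-1)
        labels = dists.argmin(1)
        for j in range(k):
            members = Z[labels == j]
            if len(members):
                centers[j] = members.mean(0)
    return labels, centers

def select_representatives(X, k=3, per_cluster=2):
    # Cluster-aware selection: taking the samples nearest each
    # centroid gives every cluster a voice, which is one simple way
    # to mitigate imbalance during sample selection.
    Z = nonlinear_embed(X)
    labels, centers = kmeans(Z, k)
    chosen = []
    for j in range(k):
        idx = np.flatnonzero(labels == j)
        d = ((Z[idx] - centers[j]) ** 2).sum(-1)
        chosen.extend(idx[np.argsort(d)[:per_cluster]].tolist())
    return sorted(chosen)

X = rng.normal(size=(60, 5))
picked = select_representatives(X)
print(picked)
```

The indices in `picked` would then be sent to an annotator; the paper's method additionally preserves global and local structure while training the embedding, which this sketch omits.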
Persistent Identifier: http://hdl.handle.net/10722/321985
ISSN: 1057-7149
2023 Impact Factor: 10.8
2023 SCImago Journal Rankings: 3.556
ISI Accession Number: WOS:000778905000001


DC Field: Value
dc.contributor.author: Li, Changsheng
dc.contributor.author: Ma, Handong
dc.contributor.author: Yuan, Ye
dc.contributor.author: Wang, Guoren
dc.contributor.author: Xu, Dong
dc.date.accessioned: 2022-11-03T02:22:48Z
dc.date.available: 2022-11-03T02:22:48Z
dc.date.issued: 2022
dc.identifier.citation: IEEE Transactions on Image Processing, 2022, v. 31, p. 2767-2781
dc.identifier.issn: 1057-7149
dc.identifier.uri: http://hdl.handle.net/10722/321985
dc.description.abstract: Unsupervised active learning has become an active research topic in the machine learning and computer vision communities; its goal is to choose a subset of representative samples to be labeled in an unsupervised setting. Most existing approaches rely on shallow linear models, assuming that each sample can be well approximated by the span (i.e., the set of all linear combinations) of the selected samples, and then take these selected samples as the representative ones for manual labeling. However, in many real-world scenarios the data do not conform to linear models, and modeling the nonlinearity of the data often becomes the key point of unsupervised active learning. Moreover, existing works often aim to reconstruct the whole dataset well while ignoring the important cluster structure, especially for imbalanced data. In this paper, we present a novel deep unsupervised active learning framework. The proposed method explicitly learns a nonlinear embedding that maps each input into a latent space via a deep neural network, and introduces a selection block to select representative samples in the learnt latent space through a self-supervised learning strategy. In the selection block, we aim not only to preserve the global structure of the data but also to capture its cluster structure, in order to handle the data imbalance issue during sample selection. Meanwhile, we take advantage of the clustering result to provide self-supervised information to guide the above processes. Finally, we attempt to preserve the local structure of the data, so that the data embedding becomes more precise and the model performance can be further improved. Extensive experimental results on several publicly available datasets clearly demonstrate the effectiveness of our method compared with the state of the art.
dc.language: eng
dc.relation.ispartof: IEEE Transactions on Image Processing
dc.subject: deep neural network
dc.subject: imbalance data
dc.subject: self-supervised learning
dc.subject: structure preserving
dc.subject: Unsupervised active learning
dc.title: Structure Guided Deep Neural Network for Unsupervised Active Learning
dc.type: Article
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/TIP.2022.3161076
dc.identifier.pmid: 35344492
dc.identifier.scopus: eid_2-s2.0-85127467551
dc.identifier.volume: 31
dc.identifier.spage: 2767
dc.identifier.epage: 2781
dc.identifier.eissn: 1941-0042
dc.identifier.isi: WOS:000778905000001
