Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1109/TIP.2022.3161076
- Scopus: eid_2-s2.0-85127467551
- PMID: 35344492
- WOS: WOS:000778905000001
Article: Structure Guided Deep Neural Network for Unsupervised Active Learning
Title | Structure Guided Deep Neural Network for Unsupervised Active Learning |
---|---|
Authors | Li, Changsheng; Ma, Handong; Yuan, Ye; Wang, Guoren; Xu, Dong |
Keywords | deep neural network; imbalance data; self-supervised learning; structure preserving; unsupervised active learning |
Issue Date | 2022 |
Citation | IEEE Transactions on Image Processing, 2022, v. 31, p. 2767-2781 |
Abstract | Unsupervised active learning has become an active research topic in the machine learning and computer vision communities; its goal is to choose a subset of representative samples to be labeled in an unsupervised setting. Most existing approaches rely on shallow linear models, assuming that each sample can be well approximated by the span (i.e., the set of all linear combinations) of the selected samples, and then take these selected samples as the representative ones for manual labeling. However, in many real-world scenarios the data do not conform to linear models, and modeling the nonlinearity of the data often becomes the key point of unsupervised active learning. Moreover, existing works often aim to reconstruct the whole dataset well while ignoring the important cluster structure, especially for imbalanced data. In this paper, we present a novel deep unsupervised active learning framework. The proposed method explicitly learns a nonlinear embedding that maps each input into a latent space via a deep neural network, and introduces a selection block to select representative samples in the learnt latent space through a self-supervised learning strategy. In the selection block, we aim not only to preserve the global structure of the data but also to capture its cluster structure, in order to handle the data imbalance issue during sample selection. Meanwhile, we take advantage of the clustering result to provide self-supervised information that guides the above processes. Finally, we attempt to preserve the local structure of the data, so that the data embedding becomes more precise and the model performance can be further improved. Extensive experimental results on several publicly available datasets clearly demonstrate the effectiveness of our method compared with state-of-the-art approaches. |
Persistent Identifier | http://hdl.handle.net/10722/321985 |
ISSN | 1057-7149; 2023 Impact Factor: 10.8; 2023 SCImago Journal Rankings: 3.556 |
ISI Accession Number ID | WOS:000778905000001 |
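The selection strategy the abstract describes (choosing representative samples in a learned latent space while respecting cluster structure) can be illustrated with a much simpler stand-in: cluster the latent embeddings and take the sample nearest each centroid, which yields one representative per cluster even when cluster sizes are imbalanced. This is a toy sketch of the intuition, not the paper's actual selection block; `select_representatives`, the farthest-point initialization, and plain k-means are all illustrative choices.

```python
import numpy as np

def select_representatives(z, k, iters=50):
    """Toy cluster-aware sample selection in a latent space.

    Runs plain k-means on the embeddings ``z`` (n x d) with a
    deterministic farthest-point initialization, then returns the index
    of the sample nearest each centroid. Picking one representative per
    cluster gives balanced coverage even when cluster sizes are skewed,
    which is the intuition (not the algorithm) behind a
    cluster-structure-aware selection block.
    """
    z = np.asarray(z, dtype=float)
    # Farthest-point init: start from sample 0, then repeatedly add the
    # sample farthest from the centroids chosen so far.
    centroids = [z[0]]
    for _ in range(1, k):
        d = np.linalg.norm(z[:, None] - np.array(centroids)[None], axis=2)
        centroids.append(z[d.min(axis=1).argmax()])
    centroids = np.array(centroids)

    # Standard Lloyd iterations: assign, then recompute cluster means.
    for _ in range(iters):
        d = np.linalg.norm(z[:, None] - centroids[None], axis=2)  # (n, k)
        labels = d.argmin(axis=1)
        new = np.array([z[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new

    # One representative per cluster: the sample closest to each centroid.
    d = np.linalg.norm(z[:, None] - centroids[None], axis=2)
    return sorted(d.argmin(axis=0).tolist())
```

For example, on 90 points near the origin and 10 points near (5, 5), selecting k = 2 representatives returns one index from each group, whereas a pure whole-dataset reconstruction criterion could plausibly draw both from the majority cluster.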
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Li, Changsheng | - |
dc.contributor.author | Ma, Handong | - |
dc.contributor.author | Yuan, Ye | - |
dc.contributor.author | Wang, Guoren | - |
dc.contributor.author | Xu, Dong | - |
dc.date.accessioned | 2022-11-03T02:22:48Z | - |
dc.date.available | 2022-11-03T02:22:48Z | - |
dc.date.issued | 2022 | - |
dc.identifier.citation | IEEE Transactions on Image Processing, 2022, v. 31, p. 2767-2781 | - |
dc.identifier.issn | 1057-7149 | - |
dc.identifier.uri | http://hdl.handle.net/10722/321985 | - |
dc.description.abstract | Unsupervised active learning has become an active research topic in the machine learning and computer vision communities; its goal is to choose a subset of representative samples to be labeled in an unsupervised setting. Most existing approaches rely on shallow linear models, assuming that each sample can be well approximated by the span (i.e., the set of all linear combinations) of the selected samples, and then take these selected samples as the representative ones for manual labeling. However, in many real-world scenarios the data do not conform to linear models, and modeling the nonlinearity of the data often becomes the key point of unsupervised active learning. Moreover, existing works often aim to reconstruct the whole dataset well while ignoring the important cluster structure, especially for imbalanced data. In this paper, we present a novel deep unsupervised active learning framework. The proposed method explicitly learns a nonlinear embedding that maps each input into a latent space via a deep neural network, and introduces a selection block to select representative samples in the learnt latent space through a self-supervised learning strategy. In the selection block, we aim not only to preserve the global structure of the data but also to capture its cluster structure, in order to handle the data imbalance issue during sample selection. Meanwhile, we take advantage of the clustering result to provide self-supervised information that guides the above processes. Finally, we attempt to preserve the local structure of the data, so that the data embedding becomes more precise and the model performance can be further improved. Extensive experimental results on several publicly available datasets clearly demonstrate the effectiveness of our method compared with state-of-the-art approaches. | - |
dc.language | eng | - |
dc.relation.ispartof | IEEE Transactions on Image Processing | - |
dc.subject | deep neural network | - |
dc.subject | imbalance data | - |
dc.subject | self-supervised learning | - |
dc.subject | structure preserving | - |
dc.subject | Unsupervised active learning | - |
dc.title | Structure Guided Deep Neural Network for Unsupervised Active Learning | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/TIP.2022.3161076 | - |
dc.identifier.pmid | 35344492 | - |
dc.identifier.scopus | eid_2-s2.0-85127467551 | - |
dc.identifier.volume | 31 | - |
dc.identifier.spage | 2767 | - |
dc.identifier.epage | 2781 | - |
dc.identifier.eissn | 1941-0042 | - |
dc.identifier.isi | WOS:000778905000001 | - |