Article: Utilizing information bottleneck to evaluate the capability of deep neural networks for image classification

Title: Utilizing information bottleneck to evaluate the capability of deep neural networks for image classification
Authors: Cheng, Hao; Lian, Dongze; Gao, Shenghua; Geng, Yanlin
Keywords: Image classification; Information bottleneck; Mutual information; Neural networks
Issue Date: 2019
Citation: Entropy, 2019, v. 21, n. 5, article no. 456
Abstract: Inspired by the pioneering work of the information bottleneck (IB) principle for Deep Neural Networks' (DNNs) analysis, we thoroughly study the relationship among the model accuracy, I(X;T) and I(T;Y), where I(X;T) and I(T;Y) are the mutual information of the DNN's output T with the input X and the label Y. Then, we design an information plane-based framework to evaluate the capability of DNNs (including CNNs) for image classification. Instead of each hidden layer's output, our framework focuses on the model output T. We successfully apply our framework to many application scenarios arising in deep learning and image classification problems, such as image classification with unbalanced data distribution, model selection, and transfer learning. The experimental results verify the effectiveness of the information plane-based framework: Our framework may facilitate a quick model selection and determine the number of samples needed for each class in the unbalanced classification problem. Furthermore, the framework explains the efficiency of transfer learning in the deep learning area.
Persistent Identifier: http://hdl.handle.net/10722/345247
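The framework described in the abstract plots I(X;T) against I(T;Y) for the model output T. For a discrete output (e.g. predicted class labels), mutual information can be estimated from empirical counts of paired samples. The sketch below is illustrative only and is not the authors' implementation; the function name and the plug-in (maximum-likelihood) estimator are assumptions.

```python
from collections import Counter
import math

def mutual_information(xs, ys):
    """Plug-in estimate of I(X; Y) in bits from paired discrete samples."""
    n = len(xs)
    px = Counter(xs)                # marginal counts of X
    py = Counter(ys)                # marginal counts of Y
    pxy = Counter(zip(xs, ys))      # joint counts of (X, Y)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p(x,y) * log2( p(x,y) / (p(x) * p(y)) ), with counts rearranged
        mi += p_joint * math.log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# Output perfectly predicts a balanced binary label: I(T; Y) = H(Y) = 1 bit.
print(mutual_information([0, 0, 1, 1], [0, 0, 1, 1]))  # 1.0
# Output independent of the label: I(T; Y) = 0 bits.
print(mutual_information([0, 1, 0, 1], [0, 0, 1, 1]))  # 0.0
```

In the paper's setting, such an estimate of I(T;Y) (against I(X;T)) locates a model on the information plane; higher I(T;Y) corresponds to an output that retains more label-relevant information.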


DC Field | Value | Language
dc.contributor.author | Cheng, Hao | -
dc.contributor.author | Lian, Dongze | -
dc.contributor.author | Gao, Shenghua | -
dc.contributor.author | Geng, Yanlin | -
dc.date.accessioned | 2024-08-15T09:26:09Z | -
dc.date.available | 2024-08-15T09:26:09Z | -
dc.date.issued | 2019 | -
dc.identifier.citation | Entropy, 2019, v. 21, n. 5, article no. 456 | -
dc.identifier.uri | http://hdl.handle.net/10722/345247 | -
dc.description.abstract | Inspired by the pioneering work of the information bottleneck (IB) principle for Deep Neural Networks' (DNNs) analysis, we thoroughly study the relationship among the model accuracy, I(X;T) and I(T;Y), where I(X;T) and I(T;Y) are the mutual information of the DNN's output T with the input X and the label Y. Then, we design an information plane-based framework to evaluate the capability of DNNs (including CNNs) for image classification. Instead of each hidden layer's output, our framework focuses on the model output T. We successfully apply our framework to many application scenarios arising in deep learning and image classification problems, such as image classification with unbalanced data distribution, model selection, and transfer learning. The experimental results verify the effectiveness of the information plane-based framework: Our framework may facilitate a quick model selection and determine the number of samples needed for each class in the unbalanced classification problem. Furthermore, the framework explains the efficiency of transfer learning in the deep learning area. | -
dc.language | eng | -
dc.relation.ispartof | Entropy | -
dc.subject | Image classification | -
dc.subject | Information bottleneck | -
dc.subject | Mutual information | -
dc.subject | Neural networks | -
dc.title | Utilizing information bottleneck to evaluate the capability of deep neural networks for image classification | -
dc.type | Article | -
dc.description.nature | link_to_subscribed_fulltext | -
dc.identifier.doi | 10.3390/e21050456 | -
dc.identifier.scopus | eid_2-s2.0-85066605153 | -
dc.identifier.volume | 21 | -
dc.identifier.issue | 5 | -
dc.identifier.spage | article no. 456 | -
dc.identifier.epage | article no. 456 | -
dc.identifier.eissn | 1099-4300 | -
