
Conference Paper: Recurrence along Depth: Deep Convolutional Neural Networks with Recurrent Layer Aggregation

Title: Recurrence along Depth: Deep Convolutional Neural Networks with Recurrent Layer Aggregation
Authors: Zhao, J; Fang, Y; Li, G
Keywords: Convolutional Neural Networks; ResNet; DenseNet; Recurrent Structures; Layer Aggregation
Issue Date: 2021
Publisher: Neural Information Processing Systems Foundation, Inc. The publisher's web site is located at https://papers.nips.cc/
Citation: 35th Conference on Neural Information Processing Systems (NeurIPS), Virtual Conference, 7-10 December 2021. In Ranzato, M ... et al (eds.), Advances in Neural Information Processing Systems 34 (NIPS 2021) pre-proceedings
Abstract: This paper introduces a concept of layer aggregation to describe how information from previous layers can be reused to better extract features at the current layer. While DenseNet is a typical example of the layer aggregation mechanism, its redundancy has been commonly criticized in the literature. This motivates us to propose a very light-weighted module, called recurrent layer aggregation (RLA), by making use of the sequential structure of layers in a deep CNN. Our RLA module is compatible with many mainstream deep CNNs, including ResNets, Xception and MobileNetV2, and its effectiveness is verified by our extensive experiments on image
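The abstract describes a recurrent state carried along the depth of the network, so that each layer can reuse information from all previous layers without DenseNet-style concatenation. The following is a minimal illustrative sketch of that idea, not the authors' implementation: the layer function, the recurrent update, and all constants below are stand-ins chosen for clarity.

```python
import numpy as np

# Toy sketch of recurrent layer aggregation (RLA): a hidden state h
# is updated recurrently alongside a stack of CNN "layers", carrying
# information from all previous layers at constant memory cost
# (independent of depth), unlike dense concatenation.

rng = np.random.default_rng(0)
C, H, W = 8, 4, 4          # channels and spatial size (toy values)
depth = 5                  # number of layers in the stack

def layer(x, h):
    """Stand-in for one conv block: the current feature map is
    computed using both the input x and the aggregated state h."""
    return np.tanh(x + 0.5 * h)

def update_hidden(h, x):
    """Stand-in for the recurrent update g(h, x), which in the paper
    is a light-weight convolutional recurrent cell."""
    return np.tanh(0.9 * h + 0.1 * x)

x = rng.standard_normal((C, H, W))
h = np.zeros((C, H, W))     # aggregated state starts empty

for t in range(depth):
    x = layer(x, h)          # current layer reuses aggregated info
    h = update_hidden(h, x)  # fold the new features into the state

print(x.shape, h.shape)      # both stay (C, H, W) regardless of depth
```

The point of the sketch is the shape invariant: because aggregation happens through a fixed-size recurrent state rather than by stacking past feature maps, the per-layer cost does not grow with depth, which is why the module can be attached to existing backbones such as ResNets with little overhead.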
Description: Poster Session 5 at Spot F0 in Virtual World
Persistent Identifier: http://hdl.handle.net/10722/307994
ISSN: 1049-5258
2020 SCImago Journal Rankings: 1.399


DC Field: Value
dc.contributor.author: Zhao, J
dc.contributor.author: Fang, Y
dc.contributor.author: Li, G
dc.date.accessioned: 2021-11-12T13:40:54Z
dc.date.available: 2021-11-12T13:40:54Z
dc.date.issued: 2021
dc.identifier.citation: 35th Conference on Neural Information Processing Systems (NeurIPS), Virtual Conference, 7-10 December 2021. In Ranzato, M ... et al (eds.), Advances in Neural Information Processing Systems 34 (NIPS 2021) pre-proceedings
dc.identifier.issn: 1049-5258
dc.identifier.uri: http://hdl.handle.net/10722/307994
dc.description: Poster Session 5 at Spot F0 in Virtual World
dc.description.abstract: This paper introduces a concept of layer aggregation to describe how information from previous layers can be reused to better extract features at the current layer. While DenseNet is a typical example of the layer aggregation mechanism, its redundancy has been commonly criticized in the literature. This motivates us to propose a very light-weighted module, called recurrent layer aggregation (RLA), by making use of the sequential structure of layers in a deep CNN. Our RLA module is compatible with many mainstream deep CNNs, including ResNets, Xception and MobileNetV2, and its effectiveness is verified by our extensive experiments on image
dc.language: eng
dc.publisher: Neural Information Processing Systems Foundation, Inc. The publisher's web site is located at https://papers.nips.cc/
dc.relation.ispartof: 35th Conference on Neural Information Processing Systems (NeurIPS), 2021
dc.relation.ispartof: Advances in Neural Information Processing Systems 34 (NIPS 2021 Proceedings)
dc.subject: Convolutional Neural Networks
dc.subject: ResNet
dc.subject: DenseNet
dc.subject: Recurrent Structures
dc.subject: Layer Aggregation
dc.title: Recurrence along Depth: Deep Convolutional Neural Networks with Recurrent Layer Aggregation
dc.type: Conference_Paper
dc.identifier.email: Li, G: gdli@hku.hk
dc.identifier.authority: Li, G=rp00738
dc.description.nature: published_or_final_version
dc.identifier.hkuros: 329473
dc.publisher.place: United States
