File Download

There are no files associated with this item.


Conference Paper: Recurrence along Depth: Deep Convolutional Neural Networks with Recurrent Layer Aggregation

Title: Recurrence along Depth: Deep Convolutional Neural Networks with Recurrent Layer Aggregation
Authors: ZHAO, J; FANG, Y; Li, G
Issue Date: 2021
Publisher: Curran Associates, Inc.
Citation: Neural Information Processing Systems (NeurIPS), v. 34, p. 10627-10640
Abstract: This paper introduces the concept of layer aggregation to describe how information from previous layers can be reused to better extract features at the current layer. While DenseNet is a typical example of the layer aggregation mechanism, its redundancy has been commonly criticized in the literature. This motivates us to propose a very lightweight module, called recurrent layer aggregation (RLA), which makes use of the sequential structure of layers in a deep CNN. Our RLA module is compatible with many mainstream deep CNNs, including ResNets, Xception and MobileNetV2, and its effectiveness is verified by extensive experiments on image classification, object detection and instance segmentation tasks. Specifically, improvements are uniformly observed on the CIFAR, ImageNet and MS COCO datasets, and the corresponding RLA-Nets surprisingly boost performance by 2-3% on the object detection task. This evidences the power of our RLA module in helping main CNNs better learn structural information in images.
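The core idea in the abstract can be sketched conceptually: a small recurrent hidden state is carried along the depth of the network and updated after every layer, so each layer's input is enriched by an aggregate of all previous layers' features. The following NumPy toy is a minimal sketch of that mechanism under assumed shapes and names (`rla_forward`, `w_hh`, `w_xh`, `w_hx` are illustrative, not the paper's implementation):

```python
import numpy as np

def rla_forward(x, layer_weights, w_hh, w_xh, w_hx):
    """Toy recurrent-layer-aggregation (RLA) forward pass.

    A hidden state `h` is updated recurrently along network depth
    (not along time), so every layer can reuse an aggregate of all
    earlier layers' features. Purely illustrative; the shared
    recurrent weights are what keep the module lightweight.
    """
    h = np.zeros(w_hh.shape[0])           # aggregated state, carried across layers
    for w in layer_weights:               # one weight matrix per "layer"
        inp = x + w_hx @ h                # layer input enriched by the aggregate
        x = np.tanh(w @ inp)              # the main layer's feature transform
        h = np.tanh(w_hh @ h + w_xh @ x)  # recurrent update with shared weights
    return x, h

# Example with feature dim 4, hidden dim 3, and 5 layers:
rng = np.random.default_rng(0)
d, k, n_layers = 4, 3, 5
weights = [0.1 * rng.normal(size=(d, d)) for _ in range(n_layers)]
x_out, h_out = rla_forward(
    rng.normal(size=d), weights,
    0.1 * rng.normal(size=(k, k)),  # w_hh
    0.1 * rng.normal(size=(k, d)),  # w_xh
    0.1 * rng.normal(size=(d, k)),  # w_hx
)
```

Because the recurrent weights are shared across depth, the parameter count of the aggregation path is independent of network depth, in contrast to DenseNet-style aggregation whose connections grow with the number of layers.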
Persistent Identifier: http://hdl.handle.net/10722/320351


DC Field: Value
dc.contributor.author: ZHAO, J
dc.contributor.author: FANG, Y
dc.contributor.author: Li, G
dc.date.accessioned: 2022-10-21T07:51:41Z
dc.date.available: 2022-10-21T07:51:41Z
dc.date.issued: 2021
dc.identifier.citation: Neural Information Processing Systems (NeurIPS), v. 34, p. 10627-10640
dc.identifier.uri: http://hdl.handle.net/10722/320351
dc.description.abstract: This paper introduces the concept of layer aggregation to describe how information from previous layers can be reused to better extract features at the current layer. While DenseNet is a typical example of the layer aggregation mechanism, its redundancy has been commonly criticized in the literature. This motivates us to propose a very lightweight module, called recurrent layer aggregation (RLA), which makes use of the sequential structure of layers in a deep CNN. Our RLA module is compatible with many mainstream deep CNNs, including ResNets, Xception and MobileNetV2, and its effectiveness is verified by extensive experiments on image classification, object detection and instance segmentation tasks. Specifically, improvements are uniformly observed on the CIFAR, ImageNet and MS COCO datasets, and the corresponding RLA-Nets surprisingly boost performance by 2-3% on the object detection task. This evidences the power of our RLA module in helping main CNNs better learn structural information in images.
dc.language: eng
dc.publisher: Curran Associates, Inc.
dc.relation.ispartof: Neural Information Processing Systems (NeurIPS)
dc.title: Recurrence along Depth: Deep Convolutional Neural Networks with Recurrent Layer Aggregation
dc.type: Conference_Paper
dc.identifier.email: Li, G: gdli@hku.hk
dc.identifier.authority: Li, G=rp00738
dc.identifier.hkuros: 339987
dc.identifier.volume: 34
dc.identifier.spage: 10627
dc.identifier.epage: 10640
