File Download
There are no files associated with this item.
Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1109/TIP.2020.3028288
- Scopus: eid_2-s2.0-85096457015
- PMID: 33035163
- WOS: WOS:000591830600002
Article: Block Proposal Neural Architecture Search
Title | Block Proposal Neural Architecture Search |
---|---|
Authors | Liu, Jiaheng; Zhou, Shunfeng; Wu, Yichao; Chen, Ken; Ouyang, Wanli; Xu, Dong |
Issue Date | 2021 |
Citation | IEEE transactions on image processing : a publication of the IEEE Signal Processing Society, 2021, v. 30, p. 15-25 |
Abstract | Existing neural architecture search (NAS) methods usually restrict the search space to pre-defined block types for a fixed macro-architecture. However, this strategy limits the search space and reduces architecture flexibility when block proposal search (BPS) is not considered in NAS. As a result, block structure search is the bottleneck in many previous NAS works. In this work, we propose a new evolutionary algorithm, referred to as latency EvoNAS (LEvoNAS), for block structure search, and incorporate it into the NAS framework by developing a novel two-stage framework referred to as Block Proposal NAS (BP-NAS). Comprehensive experimental results on two computer vision tasks demonstrate the superiority of our newly proposed approach over state-of-the-art lightweight methods. For the classification task on the ImageNet dataset, our BPN-A outperforms 1.0-MobileNetV2 at similar latency, and our BPN-B reduces latency by 23.7% compared with 1.4-MobileNetV2 while achieving higher top-1 accuracy. Furthermore, for the object detection task on the COCO dataset, our method achieves significant performance improvement over MobileNetV2, which demonstrates the generalization capability of our newly proposed framework. |
Persistent Identifier | http://hdl.handle.net/10722/321911 |
ISI Accession Number ID | WOS:000591830600002 |
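The abstract describes a latency-aware evolutionary search (LEvoNAS) over block structures. A minimal toy sketch of such a mutation-and-selection loop under a latency budget might look like the following; all block names, latency costs, accuracy proxies, and hyperparameters here are illustrative assumptions, not the paper's actual algorithm.

```python
import random

# Toy search space: an "architecture" is a fixed-length list of block choices.
# Block names, latency costs, and accuracy proxies are illustrative
# assumptions, not values from the paper.
BLOCK_CHOICES = ["mbconv3", "mbconv6", "identity"]
NUM_BLOCKS = 5
LATENCY = {"mbconv3": 2.0, "mbconv6": 3.5, "identity": 0.5}
SCORE = {"mbconv3": 1.0, "mbconv6": 1.6, "identity": 0.2}
LATENCY_BUDGET = 12.0

def latency(arch):
    return sum(LATENCY[b] for b in arch)

def fitness(arch):
    # Latency-aware fitness: reward the accuracy proxy, and heavily
    # penalize architectures that exceed the latency budget.
    score = sum(SCORE[b] for b in arch)
    return score if latency(arch) <= LATENCY_BUDGET else score - 10.0

def mutate(arch):
    # Resample one randomly chosen block.
    child = list(arch)
    child[random.randrange(len(child))] = random.choice(BLOCK_CHOICES)
    return child

def evolve(generations=50, population_size=16, seed=0):
    random.seed(seed)
    pop = [[random.choice(BLOCK_CHOICES) for _ in range(NUM_BLOCKS)]
           for _ in range(population_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)        # keep the fittest half
        survivors = pop[: population_size // 2]
        offspring = [mutate(random.choice(survivors))
                     for _ in range(population_size - len(survivors))]
        pop = survivors + offspring
    return max(pop, key=fitness)

best = evolve()
print(best, round(latency(best), 1), round(fitness(best), 1))
```

One design choice worth noting: over-budget architectures receive a fixed fitness penalty rather than being discarded outright, so mutation can still repair a near-feasible candidate in later generations.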
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Liu, Jiaheng | - |
dc.contributor.author | Zhou, Shunfeng | - |
dc.contributor.author | Wu, Yichao | - |
dc.contributor.author | Chen, Ken | - |
dc.contributor.author | Ouyang, Wanli | - |
dc.contributor.author | Xu, Dong | - |
dc.date.accessioned | 2022-11-03T02:22:17Z | - |
dc.date.available | 2022-11-03T02:22:17Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | IEEE transactions on image processing : a publication of the IEEE Signal Processing Society, 2021, v. 30, p. 15-25 | - |
dc.identifier.uri | http://hdl.handle.net/10722/321911 | - |
dc.description.abstract | Existing neural architecture search (NAS) methods usually restrict the search space to pre-defined block types for a fixed macro-architecture. However, this strategy limits the search space and reduces architecture flexibility when block proposal search (BPS) is not considered in NAS. As a result, block structure search is the bottleneck in many previous NAS works. In this work, we propose a new evolutionary algorithm, referred to as latency EvoNAS (LEvoNAS), for block structure search, and incorporate it into the NAS framework by developing a novel two-stage framework referred to as Block Proposal NAS (BP-NAS). Comprehensive experimental results on two computer vision tasks demonstrate the superiority of our newly proposed approach over state-of-the-art lightweight methods. For the classification task on the ImageNet dataset, our BPN-A outperforms 1.0-MobileNetV2 at similar latency, and our BPN-B reduces latency by 23.7% compared with 1.4-MobileNetV2 while achieving higher top-1 accuracy. Furthermore, for the object detection task on the COCO dataset, our method achieves significant performance improvement over MobileNetV2, which demonstrates the generalization capability of our newly proposed framework. | - |
dc.language | eng | - |
dc.relation.ispartof | IEEE transactions on image processing : a publication of the IEEE Signal Processing Society | - |
dc.title | Block Proposal Neural Architecture Search | - |
dc.type | Article | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/TIP.2020.3028288 | - |
dc.identifier.pmid | 33035163 | - |
dc.identifier.scopus | eid_2-s2.0-85096457015 | - |
dc.identifier.volume | 30 | - |
dc.identifier.spage | 15 | - |
dc.identifier.epage | 25 | - |
dc.identifier.eissn | 1941-0042 | - |
dc.identifier.isi | WOS:000591830600002 | - |