Conference Paper: TOWARDS FAST ADAPTATION OF NEURAL ARCHITECTURES WITH META LEARNING
Title | TOWARDS FAST ADAPTATION OF NEURAL ARCHITECTURES WITH META LEARNING |
---|---|
Authors | Lian, Dongze; Zheng, Yin; Xu, Yintao; Lu, Yanxiong; Lin, Leyu; Zhao, Peilin; Huang, Junzhou; Gao, Shenghua |
Issue Date | 2020 |
Citation | 8th International Conference on Learning Representations, ICLR 2020, 2020 |
Abstract | Recently, Neural Architecture Search (NAS) has been successfully applied to multiple artificial intelligence areas and shows better performance compared with hand-designed networks. However, existing NAS methods only target a specific task: most of them do well in searching an architecture for a single task but struggle with multiple datasets or multiple tasks. Generally, the architecture for a new task is either searched from scratch, which is neither efficient nor flexible enough for practical application scenarios, or borrowed from one searched on other tasks, which might not be optimal. To tackle the transferability of NAS and enable fast adaptation of neural architectures, we propose a novel Transferable Neural Architecture Search method based on meta-learning, termed T-NAS. T-NAS learns a meta-architecture that can adapt to a new task quickly through a few gradient steps, making the transferred architecture suitable for the specific task. Extensive experiments show that T-NAS achieves state-of-the-art performance in few-shot learning and comparable performance in supervised learning with 50x less search cost, demonstrating the effectiveness of our method. |
Persistent Identifier | http://hdl.handle.net/10722/345313 |
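
The abstract above describes adapting a meta-learned architecture to a new task through a few gradient steps. The sketch below illustrates that idea only in outline: a DARTS-style mixed operation whose mixing weights (`alpha`) are fine-tuned with MAML-style inner-loop updates on a task's support set. All class names, hyperparameters, and the toy data (`MixedOp`, `TinySearchNet`, `adapt_architecture`, `inner_lr`, etc.) are illustrative assumptions, not the authors' released T-NAS implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Weighted mixture of candidate operations (DARTS-style cell)."""
    def __init__(self, dim):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Linear(dim, dim),                           # candidate op 1
            nn.Identity(),                                 # candidate op 2
            nn.Sequential(nn.Linear(dim, dim), nn.ReLU()), # candidate op 3
        ])

    def forward(self, x, weights):
        # Blend candidate ops by their (softmaxed) architecture weights.
        return sum(w * op(x) for w, op in zip(weights, self.ops))

class TinySearchNet(nn.Module):
    def __init__(self, dim=16, n_classes=5):
        super().__init__()
        self.cell = MixedOp(dim)
        self.head = nn.Linear(dim, n_classes)

    def forward(self, x, alpha):
        return self.head(self.cell(x, F.softmax(alpha, dim=-1)))

def adapt_architecture(net, meta_alpha, support_x, support_y,
                       inner_lr=0.1, steps=5):
    """A few gradient steps on a new task's support set, updating only the
    architecture parameters (the 'meta-architecture' of the abstract)."""
    alpha = meta_alpha.clone().requires_grad_(True)
    for _ in range(steps):
        loss = F.cross_entropy(net(support_x, alpha), support_y)
        (grad,) = torch.autograd.grad(loss, alpha)
        alpha = (alpha - inner_lr * grad).detach().requires_grad_(True)
    return alpha  # task-specific architecture weights

# Usage: adapt a 3-op cell to a toy 5-way classification task.
net = TinySearchNet()
meta_alpha = torch.zeros(3)  # stands in for a meta-learned initialization
x, y = torch.randn(20, 16), torch.randint(0, 5, (20,))
task_alpha = adapt_architecture(net, meta_alpha, x, y)
print(F.softmax(task_alpha, dim=-1))  # adapted operation mixture
```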
DC Field | Value | Language |
---|---|---
dc.contributor.author | Lian, Dongze | - |
dc.contributor.author | Zheng, Yin | - |
dc.contributor.author | Xu, Yintao | - |
dc.contributor.author | Lu, Yanxiong | - |
dc.contributor.author | Lin, Leyu | - |
dc.contributor.author | Zhao, Peilin | - |
dc.contributor.author | Huang, Junzhou | - |
dc.contributor.author | Gao, Shenghua | - |
dc.date.accessioned | 2024-08-15T09:26:33Z | - |
dc.date.available | 2024-08-15T09:26:33Z | - |
dc.date.issued | 2020 | - |
dc.identifier.citation | 8th International Conference on Learning Representations, ICLR 2020, 2020 | - |
dc.identifier.uri | http://hdl.handle.net/10722/345313 | - |
dc.description.abstract | Recently, Neural Architecture Search (NAS) has been successfully applied to multiple artificial intelligence areas and shows better performance compared with hand-designed networks. However, existing NAS methods only target a specific task: most of them do well in searching an architecture for a single task but struggle with multiple datasets or multiple tasks. Generally, the architecture for a new task is either searched from scratch, which is neither efficient nor flexible enough for practical application scenarios, or borrowed from one searched on other tasks, which might not be optimal. To tackle the transferability of NAS and enable fast adaptation of neural architectures, we propose a novel Transferable Neural Architecture Search method based on meta-learning, termed T-NAS. T-NAS learns a meta-architecture that can adapt to a new task quickly through a few gradient steps, making the transferred architecture suitable for the specific task. Extensive experiments show that T-NAS achieves state-of-the-art performance in few-shot learning and comparable performance in supervised learning with 50x less search cost, demonstrating the effectiveness of our method. | -
dc.language | eng | - |
dc.relation.ispartof | 8th International Conference on Learning Representations, ICLR 2020 | - |
dc.title | TOWARDS FAST ADAPTATION OF NEURAL ARCHITECTURES WITH META LEARNING | - |
dc.type | Conference_Paper | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.scopus | eid_2-s2.0-85150658364 | - |